Archive for June, 2002

Tooth Phone

Thursday, 20 June, 2002, 11:44 GMT 12:44 UK

Put your mobile where your mouth is
That ringing in your ear could be your phone

Soon you could be swapping your mobile phone for a molar phone.
Royal College of Art students in London have developed a phone that fits inside a tooth.

The concept device picks up signals with a radio receiver and uses a tiny vibrating plate to convey them as sound along the jawbone to a person’s ear.

The designers said the mini-molar phone could be implanted in a tooth during routine dental surgery.

The prototype phone is the work of graduates James Auger and Jimmy Loizeau and forms part of the Royal College of Art’s annual summer exhibition.

Known as The Show, this exhibition shows off the best ideas of the current crop of RCA designers and students.

Bits and bites

Currently, the tooth phone is only a mock-up and lacks the communications chip to actually turn it into a functioning device.

Mr Auger said the technology to turn it into a working device already existed and it would be a simple matter to build the relevant chips into the gadget.

The designers speculate that, if the tooth phone becomes a working device, it could be used by stock traders to receive up-to-the-moment information about share prices or to help football managers communicate quickly with players during key matches.

However, the existing design is only supposed to help stimulate debate about future wearable computing devices and to help explore the social and cultural ramifications of in-body technology.

The tooth phone is on show at the Science Museum in London from 21 June to November.

Development of the device was funded by the National Endowment for Science, Technology & the Arts as part of a collaboration between the Science Museum and the Royal College of Art.

Parole Hearing For Manson Follower

Wednesday June 5, 2002 2:10 PM

SAN BERNARDINO, Calif. (AP) – A judge ordered a new parole hearing for former Charles Manson follower Leslie Van Houten, saying her good behavior behind bars should be considered.

Superior Court Judge Bob N. Krug also suggested Monday that the state Board of Prison Terms provide further guidance on other things she can do to earn her freedom.

The judge said last month that the board had not given specific reasons for denying parole to Van Houten, who was convicted of a double slaying in the 1960s.

The state board has rejected parole for Van Houten 13 times, most recently in June 2000. Board members said she could benefit from further therapy in prison.

A new parole review was scheduled for later this month, a spokesman for the state prison system said.

Van Houten’s lawyer, Christie Webb, declined to comment, saying she wanted to discuss the decision with her client first.

Van Houten, now 52, was a teen-ager when she was convicted in the slayings of Leno and Rosemary La Bianca. She was a part of the Charles Manson cult that also murdered pregnant actress Sharon Tate and four others in the summer of 1969 – one of California’s most notorious crimes.

Should the moon be developed?

Date:   06-11-02 00:54

The Moon Society, a nonprofit organization of astronomers, computer programmers and other scientists, advocates ‘large-scale industrialization and private enterprise’ on the moon.

Lunar golf courses, large-scale industrialization under debate

By Jim Carlton

May 24 — A dispute over prohibiting development on the moon is causing rising tides of controversy on earth.

High school testing eyed for schizophrenia signs

By Ellen Barry, Globe Staff, 5/25/2002

PHILADELPHIA – Hoping to head off the most debilitating of mental illnesses before it strikes, Yale University researchers are laying plans to search for a secret hidden in the brains of ninth-graders: In every group of 100 students, one will go on to develop schizophrenia.

For generations, schizophrenia has been diagnosed in late adolescence, after lives and relationships are already damaged by its painful early stages. In a scattering of research centers, including Yale’s, excitement is building around the possibility that doctors can spot ”pre-psychotic” symptoms and intervene in ways that could delay or weaken the onset of schizophrenia. Yale psychiatrists have been in talks with Connecticut schools to introduce a screen for high school freshmen.

But the idea of such early screening is contentious. Critics warn that it may be too early to identify people in the general population as being at risk for psychosis – both because prediction is still inexact and because there is no consensus on how to treat people who have not yet developed full-blown symptoms. Once a person is identified as at risk for schizophrenia, the most promising interventions – low-dose antipsychotic drugs – carry their own set of risks.

”We have to be cautious,” said Jim McNulty, president of the National Alliance for the Mentally Ill. ”We don’t know the long-term effects of medications on the human brain. It’s a trade-off.”

A Yale research center called Prevention through Risk Identification, Management and Education, or PRIME, is developing a possible student screen now, although Dr. Thomas McGlashan, PRIME’s chief investigator, said general screening was still some time in the future. ”We’re talking about a year from now” at the soonest, he said.

Schizophrenia, which afflicts 2.2 million Americans, tends to strike men in their late teens and early 20s and women slightly later, and rarely appears in older people. At the heart of the Yale plan is a tantalizing possibility: that early treatment with antipsychotics during that ”window of vulnerability” could protect young people until the age when vulnerability lessens.

On Thursday, McGlashan presented hopeful new results at the American Psychiatric Association’s annual meeting in Philadelphia.

”There is evidence to suggest that intervention in this stage can have a preventive effect,” McGlashan said.

McGlashan is a year from the end of a clinical drug trial in which 60 patients thought to be at risk for schizophrenia are administered either sugar pills or the antipsychotic drug Zyprexa as a prophylactic measure. After eight weeks of treatment, the Zyprexa group had only half the level of psychotic symptoms as the placebo group, said researcher Scott Woods.

Those in the Zyprexa group also gained an average of 10 pounds over the eight-week period, compared with an average one-pound gain in the placebo group.

McGlashan has been under scrutiny for his research on pre-onset schizophrenia in the past, in large part because of the risks in giving powerful antipsychotic drugs to young people who are not diagnosed with any mental illness. Two years ago, he was cited by federal regulators for various ethical violations, including failing to fully inform participants of risks.

In the frustrating world of mental health care, early intervention has become a watchword, and Connecticut school officials have said they are eager to break ground by adding mental health to their roster of preventive health programs.

”We looked at it much like we look at eye screens and that sort of thing,” said Nancy Pugliese, who coordinates substance abuse prevention programs for the Connecticut public schools.

As part of that effort, Larry Davidson, a psychologist from the PRIME center, plans to begin an outreach program in the fall, teaching ninth-graders the early signs of psychosis, bipolar disorder, and other mental disorders. Both schools and parents must agree for students to be enrolled in the course, he said. When PRIME develops an accurate screen for pre-onset schizophrenia, it will be administered as part of the course, said Tandy Miller, the PRIME psychiatrist developing the screen.

Catching serious mental illness early is a wonderful opportunity, said Paul Appelbaum, chairman of the department of psychiatry at the University of Massachusetts and president-elect of the American Psychiatric Association.

”If we can actually intervene and try to prevent psychosis in half the people we’re treating, isn’t that a terrific accomplishment?” Appelbaum said.

The trouble, he said, is that even specialists are only right about half the time when they predict who is going to develop schizophrenia.

Patrick McGorry, who heads the Early Psychosis Prevention and Intervention Centre in Victoria, Australia, found that 40 percent of adolescents he identified as ”pre-psychotic” experienced the onset of schizophrenia within a year.

McGlashan’s early results – published in the May issue of the American Journal of Psychiatry – show that 54 percent of 13 patients identified as pre-psychotic had developed schizophrenia within a year.

Therein lies the problem for potential screening. For the subjects who are incorrectly identified as pre-psychotic, the identification itself could be life-changing, Appelbaum said.

”Does it impact on your ability to get health insurance?” he said. ”What about the self-stigma? You may begin to think of yourself as somebody who is going to be schizophrenic.”
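The arithmetic behind that concern can be made concrete. Taking the article’s roughly 1-in-100 base rate among ninth-graders, a quick Bayes calculation, with hypothetical accuracy figures since no real screen yet exists, shows why a broad school screen would flag mostly false positives:

```python
# Illustrative base-rate calculation (not from the article).
# The ~1% base rate comes from the article; the screen's
# sensitivity and specificity below are hypothetical assumptions.

def positive_predictive_value(base_rate, sensitivity, specificity):
    """Probability that a flagged student actually develops the illness."""
    true_pos = base_rate * sensitivity
    false_pos = (1 - base_rate) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# Even a screen that is 90% sensitive and 90% specific:
ppv = positive_predictive_value(0.01, 0.90, 0.90)
print(f"{ppv:.1%}")  # roughly 8% — most flagged students would be false positives
```

With a rare condition, even an accurate test misidentifies far more healthy students than sick ones, which is why the clinical-referral figures cited below (40–54 percent conversion) cannot be expected to carry over to general-population screening.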

Davidson said that stigma will be reduced through the educational efforts PRIME plans to start in September, funded by a $99,000 grant from the National Alliance for Research on Schizophrenia and Depression. A similar educational effort in the classrooms and movie theaters of Norway had significant effects: the average delay from the first episode of psychosis to treatment went from 2 1/2 years to 2 months. The average delay before treatment in the United States is two years, largely because of a lack of awareness of the symptoms of mental illness, he said.

If Connecticut high school students were identified as at risk for psychosis, teenagers and their families would be carefully monitored, possibly at the PRIME clinic or a planned affiliate clinic in Hartford, McGlashan said.

But doctors cannot confidently advise any preventive treatment at the moment. Although there are numerous experimental treatments being explored for pre-onset psychosis – such as antidepressants or cognitive behavioral therapy – no method has been broadly tested as a preventive measure.

”You don’t recommend any treatment unless you’ve got a thorough evaluation,” Davidson said. ”The state of knowledge right now is that you would not make” the recommendation to prescribe antipsychotics.

One observer said it is crucial that a broad screening tool not be used to nudge subjects toward an experimental treatment.

”The assumption always is that screening does no harm,” said Steven Hyman, who stepped down as the director of the National Institute of Mental Health to become Harvard’s provost. But screening for depression, a common disease for which treatment is reasonably safe and effective, is different from screening for a disease that is difficult to identify and treat.

”We don’t have proven ways of knowing who is schizophrenic, and the risk-benefit ratio of treating them prior to onset of serious symptoms is not established,” Hyman said.

Economist magazine: Future of Mind Control

The future of mind control
May 23rd 2002
From The Economist print edition

People already worry about genetics. They should worry about brain science too

IN AN attempt to treat depression, neuroscientists once carried out a simple experiment. Using electrodes, they stimulated the brains of women in ways that caused pleasurable feelings. The subjects came to no harm—indeed their symptoms appeared to evaporate, at least temporarily—but they quickly fell in love with their experimenters.

Such a procedure (and there have been worse in the history of neuroscience) poses far more of a threat to human dignity and autonomy than does cloning. Cloning is the subject of fierce debate, with proposals for wholesale bans. Yet when it comes to neuroscience, no government or treaty stops anything. For decades, admittedly, no neuroscientist has been known to repeat the love experiment. A scientist who used a similar technique to create remote-controlled rats seemed not even to have entertained the possibility. “Humans? Who said anything about humans?” he said, in genuine shock, when questioned. “We work on rats.”

Ignoring a possibility does not, however, make it go away. If asked to guess which group of scientists is most likely to be responsible, one day, for overturning the essential nature of humanity, most people might suggest geneticists. In fact neurotechnology poses a greater threat—and also a more immediate one. Moreover, it is a challenge that is largely ignored by regulators and the public, who seem unduly obsessed by gruesome fantasies of genetic dystopias.

A person’s genetic make-up certainly has something important to do with his subsequent behaviour. But genes exert their effects through the brain. If you want to predict and control a person’s behaviour, the brain is the place to start. Over the course of the next decade, scientists may be able to predict, by examining a scan of a person’s brain, not only whether he will tend to mental sickness or health, but also whether he will tend to depression or violence. Neural implants may within a few years be able to increase intelligence or to speed up reflexes. Drug companies are hunting for molecules to assuage brain-related ills, from paralysis to shyness (see article).

A public debate over the ethical limits to such neuroscience is long overdue. It may be hard to shift public attention away from genetics, which has so clearly shown its sinister side in the past. The spectre of eugenics, which reached its culmination in Nazi Germany, haunts both politicians and public. The fear that the ability to monitor and select for desirable characteristics will lead to the subjugation of the undesirable—or the merely unfashionable—is well-founded.

Not so long ago neuroscientists, too, were guilty of victimising the mentally ill and the imprisoned in the name of science. Their sins are now largely forgotten, thanks in part to the intractable controversy over the moral status of embryos. Anti-abortion lobbyists, who find stem-cell research and cloning repugnant, keep the ethics of genetic technology high on the political agenda. But for all its importance, the quarrel over abortion and embryos distorts public discussion of bioethics; it is a wonder that people in the field can discuss anything else.

In fact, they hardly do. America’s National Institutes of Health has a hefty budget for studying the ethical, legal and social implications of genetics, but it earmarks nothing for the specific study of the ethics of neuroscience. The National Institute of Mental Health, one of its component bodies, has seen fit to finance a workshop on the ethical implications of “cyber-medicine”, yet it has not done the same to examine the social impact of drugs for “hyperactivity”, which 7% of American six- to eleven-year-olds now take. The Wellcome Trust, Britain’s main source of finance for the study of biomedical ethics, has a programme devoted to the ethics of brain research, but the number of projects is dwarfed by its parallel programme devoted to genetics.

Uncontrollable fears

The worriers have not spent these resources idly. Rather, they have produced the first widespread legislative and diplomatic efforts directed at containing scientific advance. The Council of Europe and the United Nations have declared human reproductive cloning a violation of human rights. The Senate is soon to vote on a bill that would send American scientists to prison for making cloned embryonic stem cells.

Yet neuroscientists have been left largely to their own devices, restrained only by standard codes of medical ethics and experimentation. This relative lack of regulation and oversight has produced a curious result. When it comes to the brain, society now regards the distinction between treatment and enhancement as essentially meaningless. Taking a drug such as Prozac when you are not clinically depressed used to be called cosmetic, or non-essential, and was therefore considered an improper use of medical technology. Now it is regarded as just about as cosmetic, and as non-essential, as birth control or orthodontics. American legislators are weighing the so-called parity issue—the argument that mental treatments deserve the same coverage in health-insurance plans as any other sort of drug. Where drugs to change personality traits were once seen as medicinal fripperies, or enhancements, they are now seen as entitlements.

This flexible attitude towards neurotechnology—use it if it might work, demand it if it does—is likely to extend to all sorts of other technologies that affect health and behaviour, both genetic and otherwise. Rather than resisting their advent, people are likely to begin clamouring for those that make themselves and their children healthier and happier.

This might be bad or it might be good. It is a question that public discussion ought to try to settle, perhaps with the help of a regulatory body such as the Human Fertilisation and Embryology Authority, which oversees embryo research in Britain. History teaches that worrying overmuch about technological change rarely stops it. Those who seek to halt genetics in its tracks may soon learn that lesson anew, as rogue scientists perform experiments in defiance of well-intended bans. But, if society is concerned about the pace and ethics of scientific advance, it should at least form a clearer picture of what is worth worrying about, and why.

Copyright © 2002 The Economist Newspaper and The Economist Group. All rights reserved.

US plan to strike enemy with Valium

Pentagon scientists aim for future battlefield victories with the aid of tranquillising drugs and GM bugs

Antony Barnett, public affairs editor
Sunday May 26, 2002
The Observer

American military chiefs are developing plans to use Valium as a potential weapon against enemy forces and to control hostile populations, according to official documents seen by The Observer.

The Pentagon has also asked scientists to evaluate proposals to use genetically modified bugs that ‘eat’ the enemy’s fuel and ammunition supplies without harming humans.

The development of these ‘non-lethal’ weapons angers campaigners who claim that they would breach international treaties on biological and chemical weapons.

US documents reveal that two years ago the Pentagon commissioned scientists at Pennsylvania State University to look at potential military uses for a range of chemicals known as calmatives. The scientists concluded that several drugs would be effective to control crowds or in military operations such as anti-terrorist campaigns. The drugs they recommended for ‘immediate consideration’ included diazepam, better known as the tranquilliser Valium, and dexmedetomidine, used to sedate patients in intensive care. The scientists advised that these drugs can ‘effectively act on central nervous system tissues and produces a less anxious, less aggressive, more tranquil-like behaviour’.

Other official documents reveal how genetically engineered micro-organisms to destroy equipment but not harm troops are also being considered by US military scientists as ‘non-lethal’ weapons. One proposal from the Office of Naval Research in Arlington, Virginia, proposes creating genetically modified bugs that would corrode roads and runways and produce ‘targeted deterioration of metal parts, coatings and lubricants of weapons vehicles and support equipment as well as fuels’.

This group of scientists has already patented micro-organisms that would decompose polyurethane, ‘a common component of paint for ships and aircraft’. Another proposal from a biotech laboratory at Brooks air force base in Texas was to modify ‘anti-material biocatalysts’ already under development. One of these breaks down fuels and plastics.

Most of the research was funded by Washington’s joint non-lethal weapons programme, in which Britain plays an active part. But further US documents, also seen by The Observer, reveal how a split has developed between the two nations, with British officials backing campaigners’ claims that using drugs such as Valium or other calmatives would be outlawed under the 1993 Chemical Weapons Convention. This protocol prohibits ‘any chemical which… can cause death, temporary incapacitation or permanent harm’.

A report of a meeting in the Ministry of Defence’s headquarters in London in November 2000 states: ‘The US and UK interpret the Chemical Weapons Convention (CWC) differently regarding riot control agents (RCA). The UK interpretation considers them to be chemical weapons under the CWC and thus proscribed; the US view is that they are not banned under that agreement. This could lead to difficulties in combined operations in certain circumstances, a situation compounded by the fact that the UK is a signatory to the European Convention on Human Rights, which further governs the use of NLW [non-lethal weapons].’

Some experts believe the use of genetically-modified microbes in military operations would breach the Biological and Toxin Weapons Convention.

Ed Hammond of the Sunshine Project – the US campaigners against biological and chemical weapons that obtained the documents – said: ‘What is absolutely shocking about these disclosures is that it represents either a massive institutional failure to implement US commitments under international treaties or it reflects an effort by some people in the Pentagon to undermine those treaties.’

A US military spokesman has denied that the Pentagon is developing ‘non-lethal’ biological or chemical weapons.

A spokesman from the Foreign Office said: ‘There are discussions between Britain and the US on all sorts of technical issues. But we both share a commitment to comply with all the international conventions governing chemical and biological weapons.’

Weather Wars by Jim Wilson

It is 2025. An enemy unknown to 20th-century Americans has massed its army at the border of a friendly country in a remote part of the world. High above them flies a single, unmanned stealth aircraft. A faint wisp of black dust sprays from its tail, spurring the creation of the only weapon capable of stopping the threatening horde.

The weapon the dust engenders is mud–old-fashioned, sink-up-to-your-knees, spin-your-tires mud. There’s nothing unusual about this slippery mixture of soil and water. It’s the same sloppy goo that forced the Roman legions to build Britain’s first real roads. What is different, in this futuristic scenario, is the way it’s delivered. Like a meal at a fancy Japanese restaurant, it is being created on the spot and to order. The “chef” is an isolated downpour that swirls only above the heads of the aggressors.

In much the same way that infrared and low-light viewing equipment has made it possible for 20th-century soldiers to own the night, U.S. Air Force planners hope to give 21st-century warriors advanced technologies that will enable them to own the weather. A declassified version of a 2-year study prepared by the Air War College and obtained by PM reveals that this is no dreamland scenario. The Pentagon’s top meteorologists believe the United States will be ready to fight–and win–a weather war early in the next century.

The study, titled “Weather As A Force Multiplier: Owning The Weather In 2025,” envisions future generals having at their disposal an impressive weather-control arsenal for tactical operations. These weapons would include unmanned stealth aircraft that could seed clouds above massing troops with fine particles of heat-absorbing carbon. This next-generation cloud-seeding technique would, in turn, produce localized flooding and create mud, which has been the bane of all of history’s armies. Airborne lasers would cause lightning to discharge over the airframes of attack and surveillance aircraft. Other lasers would fire at fog banks, clearing a temporary flight path to high-value targets, such as command posts. In addition, still more powerful microwave transmitters would heat the ionosphere, altering its reflective properties in ways that would disrupt communications among enemy field commanders.

To reach this future battlefield, the military is planning to piggyback on weather-prediction and weather-modification technologies being developed by the private sector. They estimate that by 2015 supercomputer and atmosphere-monitoring technologies will have advanced to the point where military planners will know exactly what sort of weather to expect over an operations area throughout the course of a campaign lasting several weeks.

The great leap forward, however, is expected to occur between 2015 and 2025, spurred on largely by a growing global population that will put increasing pressure on the worldwide food and drinkable water supplies. “These pressures [will] prompt governments and/or other organizations who are able to capitalize on the technological advances of the previous 20 years to pursue a highly accurate and reasonably precise weather-modification capability,” the report states.

“Our vision is that by 2025 the military could influence the weather on a mesoscale [theater-wide] or microscale [immediate local area] to achieve operational capabilities.”

The report makes the limitations of the military’s current weather-predicting abilities disturbingly clear: “During Operation Desert Storm, Gen. Buster C. Glosson asked his weather officer to tell him which targets would be clear in 48 hours for inclusion in the air tasking order (ATO). But current forecasting capability is only 85% accurate for no more than 24 hours, which doesn’t adequately meet the needs of the ATO planning cycle. Over 50% of the F-117 sorties weather aborted over their targets and A-10s only flew 75 of 200 scheduled close air support missions due to low cloud cover during the first two days of the campaign.”

Whether weather modification can actually turn the tide of battle remains an open question. The American military’s only acknowledged recent experience in using weather as a weapon occurred with Project Popeye, which began in 1966. The experiment’s objective was to extend the monsoon season, thereby increasing the amount of mud that formed on the Ho Chi Minh trail, a supply route that wound from what was then North Vietnam through Laos and Cambodia into South Vietnam. To produce the rain, a silver iodide rainmaking agent–dubbed “Olive Oil”–was dispersed from WC-130, F4 and A-1E aircraft into the clouds over the trail.

Positive results during the initial program led to its continued operation until 1972. But to this day, analysts remain divided over whether the rain created enough extra mud to significantly reduce the delivery of supplies. When you’re slogging through ankle-deep mud, another inch of it probably doesn’t make that much of a difference.