Monthly Archives: June 2009

Science Academies: renewable power tech ready for big growth

The US National Academies of Science has looked at the potential for renewable power in its home country, and determined that current solar and wind technologies could probably scale to supply 20 percent of our electricity. Beyond that, however, we’re going to need to fix the grid.

A number of renewable energy technologies are poised for significant growth. Wind turbine production is booked for several years, and several companies can now produce a gigawatt of capacity annually. Although the US started from a small base, these power sources have grown at an annual rate of about 20 percent for most of the past decade, a period in which demand grew only about one percent annually. The US National Academies of Science has now examined the prospects for continued growth and sees no limits within the next decade and beyond, but, should growth continue, significant changes to the national grid will be needed.

The report was prepared as part of the America’s Energy Future Project, which is supported by everyone from General Electric to the Kavli and Keck charitable foundations. It’s the second of several planned reports; the next one will target prospects for energy-efficient technology.

The report excludes hydropower, which is renewable, but constrained by the availability of appropriate water resources. At the moment, these other sources—geothermal, solar, biomass, and wind—account for about 2.5 percent of US electricity generating capacity, and estimates are that, under a business-as-usual scenario, they would reach eight percent by 2030. The report addresses the question of whether they’d be capable of scaling, should the US determine it wanted to increase reliance on these technologies (the total available solar and wind energy within the US, at 13.9 million TWh, dwarfs any reasonable future projections of demand). The authors limited their consideration of biomass use because they felt it was likely that the government would promote its use as a transportation fuel.
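As a rough sense of scale (illustrative arithmetic only, not the report's modelling), compounding the growth rates quoted above shows how quickly the renewable share could rise if the past decade's pace were sustained, and why that far outruns the business-as-usual projection:

```python
# Illustrative projection using the growth figures quoted in the article;
# this is simple compound growth, not the National Academies' methodology.
renewable_share = 0.025   # non-hydro renewables: ~2.5% of US generating capacity
renewable_growth = 0.20   # ~20% annual growth over most of the past decade
demand_growth = 0.01      # ~1% annual growth in overall demand

for year in range(2009, 2031):
    if renewable_share >= 0.20:
        print(f"Share reaches 20% around {year}")
        break
    # renewables grow faster than total demand, so their share of the mix rises
    renewable_share *= (1 + renewable_growth) / (1 + demand_growth)
else:
    print(f"Share in 2030: {renewable_share:.1%}")
```

Under these assumptions the 20 percent mark arrives in roughly a dozen years, which is why the report treats the grid, rather than the technologies themselves, as the binding constraint.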

Source

Lung-on-a-chip could replace countless lab rats

“MICROLUNGS” grown from human tissue might one day help to replace the vast numbers of rats used to check the safety of drugs, cosmetics and other chemicals. The work is part of a growing drive to develop toxicology tests based on human cells as a replacement for animal testing.

Such efforts are driven partly by ethical concerns, and partly because animal testing is so time-consuming and expensive. For example, the European Union’s REACH regulations require about 30,000 chemicals to be tested for toxicity over the next decade. Yet testing the effects of inhaling a single dose of a particular chemical typically requires more than 200 rats, while testing the chronic effects of breathing it in over time can require more than 3,000. Meanwhile the EU Cosmetics Directive – which covers items from deodorants and perfume to air-fresheners – seeks to ban all tests of cosmetics on animals by 2013.
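To get a feel for those numbers, here is a back-of-envelope tally using only the per-chemical figures quoted above; it is an illustrative estimate, not an official EU projection:

```python
# Back-of-envelope estimate from the figures in the article;
# not an official EU or REACH projection.
chemicals = 30_000     # chemicals to be tested under REACH over the next decade
rats_acute = 200       # rats for a single-dose inhalation test
rats_chronic = 3_000   # rats for a chronic inhalation study

worst_case = chemicals * (rats_acute + rats_chronic)
print(f"If every chemical needed both tests: ~{worst_case:,} rats")
# => If every chemical needed both tests: ~96,000,000 rats
```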

The obvious alternative is to test chemicals on human cells grown in the lab. The difficulty, however, lies in enticing those cells to form complex tissue that responds as our organs do.

Source

Cells Are Like Robust Computational Systems, Scientists Report

Gene regulatory networks in cell nuclei are similar to cloud computing networks, such as Google or Yahoo!, researchers report today in the online journal Molecular Systems Biology. The similarity is that each system keeps working despite the failure of individual components, whether they are master genes or computer processors.

This finding by an international team led by Carnegie Mellon University computational biologist Ziv Bar-Joseph helps explain not only the robustness of cells, but also some seemingly incongruent experimental results that have puzzled biologists.

“Similarities in the sequences of certain master genes allow them to back up each other to a degree we hadn’t appreciated,” said Bar-Joseph, an assistant professor of computer science and machine learning and a member of Carnegie Mellon’s Ray and Stephanie Lane Center for Computational Biology.

Between 5 and 10 percent of the genes in all living species are master genes that produce proteins called transcription factors that turn all other genes on or off. Many diseases are associated with mutations in one or several of these transcription factors. However, as the new study shows, if one of these genes is lost, other “parallel” master genes with similar sequences, called paralogs, often can replace it by turning on the same set of genes.
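One way to picture the backup effect is as a toy redundancy model. The sketch below is a hypothetical illustration with made-up gene names, not the network model used in the study:

```python
# Hypothetical toy model of paralog redundancy; gene names are made up
# and this is not the actual network model from the study.
regulators = {
    "masterA":  {"g1", "g2", "g3"},   # a master gene and its target genes
    "masterA2": {"g1", "g2", "g3"},   # sequence-similar paralog with the same targets
    "masterB":  {"g4", "g5"},         # unrelated master gene with no backup
}

def genes_still_on(knocked_out):
    """Return the target genes still activated after knocking out the given regulators."""
    on = set()
    for regulator, targets in regulators.items():
        if regulator not in knocked_out:
            on |= targets
    return on

print(genes_still_on({"masterA"}))              # paralog covers the loss: g1-g5 stay on
print(genes_still_on({"masterA", "masterA2"}))  # redundancy exhausted: only g4, g5 remain
print(genes_still_on({"masterB"}))              # no paralog: g4 and g5 are lost
```

Knocking out a single master gene changes little when a paralog covers the same targets, which is why single-gene knockout experiments can look puzzlingly uneventful.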

Source

Scientists invent 1.2nm molecular gear

Scientists from A*STAR’s Institute of Materials Research and Engineering (IMRE), led by Professor Christian Joachim, have scored a breakthrough in nanotechnology by becoming the first in the world to invent a molecular gear just 1.2nm in size whose rotation can be deliberately controlled. This achievement marks a radical shift in the scientific progress of molecular machines and is published in Nature Materials, one of the most prestigious journals in materials science.

Said Prof Joachim, “Making a gear the size of a few atoms is one thing, but being able to deliberately control its motions and actions is something else altogether. What we’ve done at IMRE is to create a truly complete working gear that will be the fundamental piece in creating more complex molecular machines that are no bigger than a grain of sand.”

Prof Joachim and his team discovered that the way to successfully control the rotation of a single-molecule gear is via the optimization of molecular design, molecular manipulation and surface atomic chemistry. This was a breakthrough because, before the team’s discovery, motions of molecular rotors and gears were random and typically consisted of a mix of rotation and lateral displacement. The scientists at IMRE solved this scientific conundrum by proving that the rotation of the molecule-gear could be well controlled by manipulating the electrical connection between the molecule and the tip of a Scanning Tunnelling Microscope while it was pinned on an atom axis.

Opening Doors on the Way to a Personal Robot

Consider it one small step — or a roll, actually — for a robot, and a significant, if not giant, step for robotics.

Willow Garage, a Silicon Valley robotics research group, said that its experimental PR2 robot, which has wheels and can travel at speeds up to a mile and a quarter per hour, was able to open and pass through 10 doors and plug itself into 10 standard wall sockets in less than an hour. In a different test, the same robot completed a marathon in the company’s office, traveling 26.2 miles. PR2 will not compete with humans yet; it took more than four days.
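A quick check of the arithmetic, using only the figures quoted above, shows how leisurely that marathon actually was:

```python
# Quick arithmetic check using only the figures quoted in the article.
distance_miles = 26.2
duration_hours = 4 * 24    # "more than four days", so treat this as a lower bound
top_speed_mph = 1.25       # "a mile and a quarter per hour"

print(f"Average speed: {distance_miles / duration_hours:.2f} mph")           # ~0.27 mph
print(f"Marathon at top speed: {distance_miles / top_speed_mph:.0f} hours")  # ~21 hours
```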

For the person who wants to buy a fully functioning robot butler, this may not seem so impressive. But for roboticists and a new generation of technologists in Silicon Valley, this is a significant achievement, a step along the way to the personal robot industry.

Willow Garage was founded by Scott Hassan, one of the designers of the original Google search engine. The company’s name is a reference to a small garage on Willow Road in Menlo Park, Calif., which was Google’s first office. The company is trying to develop a new generation of robotic personal assistants. Roboticists here and at other companies envision creating something on the scale of the personal computer industry, with mechanical personal assistants taking over a lot of drudgery, from cleaning up to fetching a beer from the refrigerator.

Source

The future of robots is rat-shaped

If so, it will be time to scream… but out of joy, rather than fear, for it could be a turning point in the history of robotics.

Psikharpax — named after a cunning king of the rats, according to a tale attributed to Homer — is the brainchild of European researchers who believe it may push back a frontier in robotics.

Scientists have strived for decades to make a robot that can do more than make repetitive, programmed gestures. These are fine for making cars or amusing small children, but are of little help in the real world.

One of the biggest obstacles is learning ability. Without the smarts to figure out dangers and opportunities, a robot is helpless without human intervention.

“The autonomy of robots today is similar to that of an insect,” snorts Guillot, a researcher at France’s Institute for Intelligent Systems and Robotics (ISIR), one of the “Psikharpax” team.

Such failures mean it is time to change tack, argue some roboticists.

Source

Roll-Up Solar Panels

Xunlight, a startup in Toledo, Ohio, has developed a way to make large, flexible solar panels. Its roll-to-roll manufacturing technique forms thin-film amorphous silicon solar cells on thin sheets of stainless steel. Each solar module is about one meter wide and five and a half meters long.

As opposed to conventional silicon solar panels, which are bulky and rigid, these lightweight, flexible sheets could easily be integrated into roofs and building facades or on vehicles. Such systems could be more attractive than conventional solar panels and be incorporated more easily into irregular roof designs. They could also be rolled up and carried in a backpack, says the company’s cofounder and president, Xunming Deng. “You could take it with you and charge your laptop battery,” he says.

Amorphous silicon thin-film solar cells can be cheaper than conventional crystalline cells because they use a fraction of the material: the cells are 1 micrometer thick, as opposed to the 150-to-200-micrometer-thick silicon layers in crystalline solar cells. But they’re also notoriously inefficient. To boost their efficiency, Xunlight made triple-junction cells, which use three different materials–amorphous silicon, amorphous silicon germanium, and nanocrystalline silicon–each of which is tuned to capture the energy in different parts of the solar spectrum. (Conventional solar cells use one primary material, which only captures one part of the spectrum efficiently.)
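The material savings implied by those thicknesses are easy to quantify. The comparison below is illustrative arithmetic based on the figures quoted above, not Xunlight's own data:

```python
# Illustrative comparison based on the layer thicknesses quoted in the article.
thin_film_um = 1.0               # amorphous silicon absorber, ~1 micrometer thick
crystalline_um = (150.0, 200.0)  # conventional crystalline silicon, 150-200 micrometers

low, high = (t / thin_film_um for t in crystalline_um)
print(f"Crystalline cells use roughly {low:.0f}x to {high:.0f}x more silicon per unit area")
# => Crystalline cells use roughly 150x to 200x more silicon per unit area
```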

Source

Making Fat Disappear

Can burning excess fat be as easy as exhaling? That’s the finding of a provocative new study by researchers at the University of California, Los Angeles (UCLA), who transplanted a fat-burning pathway used by bacteria and plants into mice. The genetic alterations enabled the animals to convert fat into carbon dioxide and remain lean while eating the equivalent of a fast-food diet.

The feat, detailed in the current issue of Cell Metabolism, introduces a new approach to combating the growing obesity problem in humans. Although the proof-of-concept study is far from being tested in humans, it may point to new strategies for borrowing biological functions from bacteria and other species to improve human health.

To create the fat-burning mice, the researchers focused on a metabolic strategy used by some bacteria and plants called the glyoxylate shunt. James Liao, a biomolecular-engineering professor at UCLA and a senior author of the study, says, “This pathway is essential for the cell to convert fat to sugar” and is used when sugar is not readily available or to convert the fat stored in plant seeds into usable energy. Liao also says that it’s not known why mammals lack this particular strategy, although it may be because our bodies are designed to store fat rather than burn it.

Source

This Flight Sim Needs 120 Graphics Cards Just To Get Off The Ground

Back when they were popular, flight sims needed some pretty hefty hardware to get them running. But I can’t remember any of them ever having “120 dedicated graphics cards” under the “required” section on the side of the box.

But the HD World does. A custom F-16 fighter simulator, it runs off 120 dual-core PCs, each with a $400 graphics card inside, all chained together.

All that processing power gets you 10,000 “entities” on screen at once, realistic explosion and destruction effects and “20/40 visual acuity”, which is apparently as close to photo-realism as current projector technology can manage in a situation like this.

Oh, and it all comes wrapped in a 180-degree screen, along with a fully authentic replica of an F-16 cockpit.

If it didn’t cost millions and millions of dollars, I’d already have one on order. You can check out a clip of the sim in action below, courtesy of the Star Telegram.

Source

Building The Exascale Computer

What if we gave scientists machines that dwarf today’s most powerful supercomputers? What could they tell us about the nature of, say, a nuclear explosion? Indeed, what else could they discover about the world? This is the story of the quest for an exascale computer – and how it might change our lives.

What is exascale?

One exaflop is 1,000 times faster than a petaflop. The fastest computer in the world is currently the IBM-based Roadrunner, which is located in Los Alamos, New Mexico. Roadrunner runs at an astounding one petaflop, which equates to more than 1,000 trillion operations per second. The supercomputer has 129,600 processing cores and takes up more room than a small house, yet it’s still not quite fast enough to run some of the most intense global weather simulations, nuclear tests and brain modelling tasks that modern science demands.

For example, the lab currently uses the processing power of Roadrunner to run complex visual cortex and cellular modelling experiments in almost real time. In the next six months, the computer will be used for nuclear simulation and stockpile tests to make sure that the US nuclear weapon reserves are safe. However, when exascale calculations become a reality, the lab could step up to running tests on ocean and atmosphere interactions, which are not currently possible because the data streams involved are simply too large. The move to exascale is therefore critical, because researchers require increasingly fast results from their experiments.
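For a sense of scale, here is the arithmetic behind those figures; it is illustrative only and assumes a workload scales perfectly with raw speed:

```python
# Illustrative arithmetic on the figures quoted in the article.
petaflop = 1e15              # more than 1,000 trillion operations per second
exaflop = 1_000 * petaflop   # one exaflop = 1,000 petaflops = 1e18 ops per second

speedup = exaflop / petaflop  # Roadrunner runs at roughly one petaflop
print(f"An exascale machine would be {speedup:,.0f}x faster than Roadrunner")

# Naive scaling: a simulation that occupies Roadrunner for a year would,
# assuming perfect scaling, finish in under nine hours at exascale.
print(f"A year-long run would shrink to about {365 * 24 / speedup:.1f} hours")
```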

Source