Wednesday, November 26, 2008

Odontochelys Rendition

Plate Tectonics and Oceans in the Earliest of Epochs

A new picture of the early Earth is emerging, including the surprising finding that plate tectonics may have started more than 4 billion years ago — much earlier than scientists had believed, according to new research by UCLA geochemists reported Nov. 27 in the journal Nature.

"We are proposing that there was plate-tectonic activity in the first 500 million years of Earth's history," said geochemistry professor Mark Harrison, director of UCLA's Institute of Geophysics and Planetary Physics and co-author of the Nature paper. "We are reporting the first evidence of this phenomenon."

"Unlike the longstanding myth of a hellish, dry, desolate early Earth with no continents, it looks like as soon as the Earth formed, it fell into the same dynamic regime that continues today," Harrison said. "Plate tectonics was inevitable, life was inevitable. In the early Earth, there appear to have been oceans; there could have been life — completely contradictory to the cartoonish story we had been telling ourselves."

"We're revealing a new picture of what the early Earth might have looked like," said lead author Michelle Hopkins, a UCLA graduate student in Earth and space sciences. "In high school, we are taught to see the Earth as a red, hellish, molten-lava Earth. Now we're seeing a new picture, more like today, with continents, water, blue sky, blue ocean, much earlier than we thought."

The Earth is 4.5 billion years old. Some scientists think plate tectonics — the geological phenomenon involving the movement of huge crustal plates that make up the Earth's surface over the planet's molten interior — started 3.5 billion years ago, others that it began even more recently than that.

The research by Harrison, Hopkins and Craig Manning, a UCLA professor of geology and geochemistry, is based on their analysis of ancient mineral grains known as zircons found inside molten rocks, or magmas, from Western Australia that are about 3 billion years old. Zircons are heavy, durable minerals related to the synthetic cubic zirconia used for imitation diamonds and costume jewelry. The zircons studied in the Australian rocks are about twice the thickness of a human hair.

Hopkins analyzed the zircons with UCLA's high-resolution ion microprobe, an instrument that enables scientists to date and learn the exact composition of samples with enormous precision. The microprobe shoots a beam of ions, or charged atoms, at a sample, releasing from the sample its own ions, which are then analyzed in a mass spectrometer. Scientists can aim the beam of ions at specific microscopic areas of a sample and conduct a high-resolution isotope analysis of them without destroying the object.

"The microprobe is the perfect tool for determining the age of the zircons," Harrison said.

The analysis determined that some of the zircons found in the magmas were more than 4 billion years old. They were also found to have been formed in a region with heat flow far lower than the global average at that time.

"The global average heat flow in the Earth's first 500 million years was thought to be about 200 to 300 milliwatts per meter squared," Hopkins said. "Our zircons are indicating a heat flow of just 75 milliwatts per meter squared — the figure one would expect to find in subduction zones, where two plates converge, with one moving underneath the other."

"The data we are reporting are from zircons from between 4 billion and 4.2 billion years ago," Harrison said. "The evidence is indirect, but strong. We have assessed dozens of scenarios trying to imagine how to create magmas in a heat flow as low as we have found without plate tectonics, and nothing works; none of them explain the chemistry of the inclusions or the low melting temperature of the granites."

Evidence for water on Earth during the planet's first 500 million years is now overwhelming, according to Harrison.

"You don't have plate tectonics on a dry planet," he said.


Think we can say that any more forcefully?

It's a very interesting finding if confirmed. Consider then that the world had tectonics active within 500 my of its existence. How many different supercontinents arose and fell? How many different archipelagic continental configurations did we have? This world has been alien so many times over it seems.

To Osteoderm or Not to Osteoderm?

With hard bony shells to shelter and protect them, turtles are unique and have long posed a mystery to scientists who wonder how such an elegant body structure came to be.

Since the age of dinosaurs, turtles have looked pretty much as they do now with their shells intact, and scientists lacked conclusive evidence to support competing evolutionary theories. Now with the discovery in China of the oldest known turtle fossil, estimated at 220 million years old, scientists have a clearer picture of how the turtle got its shell.

Working with colleagues in China and Canada, Olivier Rieppel, PhD, chairman of The Field Museum's department of geology, has analyzed the Chinese turtle fossil, finding evidence to support the notion that turtle shells are bony extensions of their backbones and ribs that expanded and grew together to form a hard protective covering.

The fossilized turtle ancestor, dubbed Odontochelys semitestacea (translation: half-shelled turtle with teeth), likely lived in the water rather than on land.

A report from Chun Li of the Institute of Vertebrate Paleontology and Paleoanthropology, Chinese Academy of Sciences in Beijing, and Xiao-Chun Wu of the Canadian Museum of Nature in Ottawa, along with Field's Rieppel, will appear in the journal Nature. Other co-authors include Li-Ting Wang of the Geological Survey of Guizhou Province in Guiyang, China, where the fossil was discovered, and Li-Jun Zhao of the Zhejiang Museum of Natural History in Hangzhou, China.

Prior to discovery of Odontochelys, the oldest known turtle specimen was Proganochelys, which was found in Germany. Because Proganochelys has a fully-formed shell, it provides little information about how shells were formed. Odontochelys is older than Proganochelys and is helpful because it has only a partial shell, Rieppel said.

"This is the first turtle with an incomplete shell," Rieppel said. "The shell is an evolutionary innovation. It's difficult to explain how it evolved without an intermediate example."

Some contemporary reptiles such as crocodiles have skin with bony plates and this was also seen in ancient creatures such as dinosaurs. Some researchers theorized that turtle shells started as bony skin plates, called osteoderms, which eventually fused to form a hard shell.

There are problems with this idea, including studies of how shells form in turtle embryos as they develop within eggs, Rieppel said. Embryo studies show that the turtle backbones expand outward and the ribs broaden to meet and form a shell, he said.

While paleontologists take such studies into account, they aren't sufficient to prove how anatomy evolved over time, and evidence can be read in different ways. The limbs of Proganochelys, for example, show signs of bony plates in the skin.

But Odontochelys has no osteoderms and it has a partial shell extending from its backbone, Rieppel said. It also shows a widening of ribs. Although Odontochelys has only a partial shell protecting its back, it does have a fully formed plastron – complete protection of its underside – just as turtles do today.

This strongly suggests Odontochelys was a water dweller whose swimming exposed its underside to predators, Rieppel said. "Reptiles living on the land have their bellies close to the ground with little exposure to danger," he said.

Other arguments favor the notion that turtle shells evolved as extensions of the reptile's backbones and ribs, Rieppel said, but the partial shell of Odontochelys speaks very clearly.

"This animal tells people to forget about turtle ancestors covered with osteoderms," he said.


That's interesting because Bill Parker blogged about a recent fossil that indicated that osteoderms were the likely ancestral form of the turtle shell. Y'know, I have to wonder if turtles are actually paraphyletic. Could we be seeing multiple lineages with convergence? Just a thought.

Tuesday, November 25, 2008

Post 2401: Parenting Question

I have an oddball question for the parents out there that read the blog. How many of you have ambidextrous kids? Avrora appears to be exactly that. I have concerns about screwing it up. Why? Because I was ambi when I was her age and up through kindergarten...and then the teachers made a concerted effort to get me to use only my right hand. Now, after years of using it that way, I'm pretty much right handed, but can still use the left better than average. Slightly.

So, advice? Anyone?


Monday, November 24, 2008

Juno to Jupiter!!!


I'm still grumbling over JIMO's cancellation though.

Quick Question

Has anyone ever done the calculation of just how much water has been lost from the Earth over the Phanerozoic? I mean through UV light splitting the water molecules up high and the hydrogen escaping because of its low mass and the Earth's warmth relative to its gravity? I have heard discussions of this with respect to Venus, but not Earth...my google fu isn't so good today.
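Since I can't find one, here's my own napkin version in Python. Big hedge: I'm assuming today's hydrogen escape rate (roughly 3 kg/s is the figure I've seen kicked around for the modern Earth) held constant across the whole Phanerozoic, which it surely didn't:

```python
# Back-of-envelope: water lost to space via hydrogen escape over the
# Phanerozoic. ASSUMPTION: today's escape rate (~3 kg of hydrogen per
# second, a commonly cited modern estimate) held constant for ~540 Myr.

H_ESCAPE_RATE_KG_S = 3.0       # kg of hydrogen per second (assumed)
PHANEROZOIC_YEARS = 540e6      # duration of the Phanerozoic
SECONDS_PER_YEAR = 3.156e7
OCEAN_MASS_KG = 1.4e21         # approximate mass of the modern oceans

h_lost = H_ESCAPE_RATE_KG_S * PHANEROZOIC_YEARS * SECONDS_PER_YEAR
water_lost = h_lost * 18.0 / 2.0   # every 2 kg of H came from 18 kg of H2O

print(f"Hydrogen lost:  {h_lost:.2e} kg")
print(f"Water lost:     {water_lost:.2e} kg")
print(f"Ocean fraction: {water_lost / OCEAN_MASS_KG:.2%}")
```

If that rate is anywhere near right, the Phanerozoic loss works out to a few hundredths of a percent of an ocean, which might be why nobody bothers discussing it for Earth the way they do for Venus.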

The Bean Finds the Rays!


Press release here. Universe Today is better and above. Cosmic rays are getting a lot of press these days.

PS Go Sorta Divorced from Us Sister Lab LANL!

Huber on "A Hotter Greenhouse?"


Half of Earth's surface area is in the tropics, so changes and uncertainties in tropical temperatures dominate any climate sensitivity estimate. If SSTs were truly ~35°C at times in Tanzania (19°S) or New Jersey (~30°N), some tropical regions must have been much hotter. This has thought-provoking implications for paleoclimate, vegetation, and carbon cycle evolution.

First, tropical temperatures above 31°C offer no evidence for a climate thermostat, that is, a strict mechanism that maintains tropical SSTs in the modern range; climate dynamicists trying for decades to explain thermostats may have been chasing a chimera. Second, climate models might be able to reproduce warm poles and warm extratropical continental winters, given that these new tropical SSTs imply closer to modern temperature gradients (5).

Third, during the warmest parts of the past 65 million years--that is, the Paleocene-Eocene Thermal Maximum (PETM) and subsequent brief, sudden "hyperthermal" phases of the Early Eocene Climate Optimum (17)--tropical vegetation may have been above the upper limits of its thermal tolerance (18). Most plants, especially the C3 plants that comprised Eocene floras, have physiological mechanisms that break down in the 35° to 40°C range (18, 19); in particular, they can die because photorespiration dominates over photosynthesis (18-20). Annual mean temperatures greater than 35°C can be plausibly reconstructed to have been widespread equatorward of 35° latitude (8, 9, 21, 22), so floras may have been thermally stressed, and perhaps undergoing water stress in the warmest intervals. There is some evidence of tropical floral extinctions during the warmest periods (23, 24), while forests thrived at higher latitudes.

This scenario may be a missing link in the hypothesis (25) that carbon cycle and climate changes during the PETM were caused by oxidation of the terrestrial biosphere. It is well established that a major tropical vegetation die-off in a global warming world has profound temperature, precipitation, and carbon feedbacks (20). Carbon cycle modeling (26) suggests that the terrestrial carbon pool could have been much larger than modern, ~6000 gigatons of carbon. The gradual warming preceding the PETM may have loaded a terrestrial carbon storage gun, and crossing the 35°C threshold may have triggered it. Tropical die-back after an initial warming (22) might have added thousands of gigatons of carbon into the atmosphere and further increased temperatures by radically reducing evapotranspirative fluxes that normally cool tropical landmasses. Tropical heat death helps resolve two mysteries: the magnitude of the carbon and climate excursion at the PETM, and the fact that these abrupt warmings occur during broader intervals of extreme warmth, rather than in cold intervals as expected from methane degassing (27).

The recent results suggest that, rather than being a stable cradle for tropical life, the tropics may have been a crucible; during warming, many taxa may have been forced to flee poleward, innovate, or face extinction (28). These far-ranging implications are a lot to place on the narrow shoulders of the few published proxy records, but they highlight the importance of the next challenge: collecting more tropical multiproxy records and establishing the accuracy of existing ones.


Dr Matt Huber of Purdue is someone I have spoken to on occasion about paleoclimate matters. We started to talk about doing a collaboration, but it fell apart when we couldn't sync up. Alas. That said, a couple years ago he was looking into doing some simulations of the Pliocene. The world at that point had a climate about 2.5 C warmer than now. IDK if Matt ever got around to doing the simulations or not, but it seems someone else now has an interest. Unfortunately, Scotese lacks a Pliocene climate map. Otherwise I'd point you there. Well, the Miocene may not be that far off.

That said, it appears that the tropics are a highly unstable region ecologically. During the Pleistocene, the tropical forest was shattered into small patches all over the place. During the Eocene, if Matt's hypothesis is right, the tropics became too hot for C3 plants to survive. This idea that the tropics have been radically unstable over time is interesting because it would explain why the tropics always have the highest extinction rates: their environment, rather than being the most stable, is the most volatile. And since most tropical critters are not large, and the fossil record biases towards the largest critters anyway, this may also explain why there are some questions about which area has the highest origination rates.

That said, if the tropics of tomorrow are still wet and the temperatures that much higher, could we see the first great tropical forests of grass and succulents? That'd be interesting!

Hey, didn't the succulents originate in the tropics? I seem to recall as a child hearing that and it being something of a surprise. Perhaps this is why. The succulents present there now are survivors from a time when the tropics were more arid.

Sunday, November 23, 2008

The Wikipedia Permian Artist Found?


Gorgon going to eat an amphib, and a double view of gorgons: with and without hair. Without hair seems more alien. With hair seems more menacing for some reason.

Friday, November 21, 2008

SpecFor RoboCopter


2,500 mile range with 300 lbs of payload and a 24-hour endurance. The Special Forces are taking delivery of 20 of them in a joint program with DARPA. The Danger Room has a nice little rundown on the status of the program.

IMNSHO, the recon helicopter that the US Army is having fits over ought to be something derived from the above: the initial cost of the Hummingbird was $50 million for the prototype versus $300 million for the canceled ARH-70. Cut out the weight of the potential passengers and you get something very close to what the Hummingbird can do. The Hummingbird also has a far, far greater range than the ARH-70 was to have: 2,500 miles vs a mere 162 miles.

Also, if there are any attempts at direct replacements for the Apache or Cobra, something like this, albeit with a much bigger payload, ought to be considered.

PS No comparisons to this, please:


I knew you were thinking it.

Faster Qubit Bit Flips

The promise of quantum computing is that it will dramatically outshine traditional computers in tackling certain key problems: searching large databases, factoring large numbers, creating uncrackable codes and simulating the atomic structure of materials.

A quantum step in that direction, if you'll pardon the pun, has been taken by Stanford researchers who announced their success in a paper published in the journal Nature. Working in the Ginzton Laboratory, they've employed ultrafast lasers to set a new speed record for the time it takes to rotate the spin of an individual electron and confirm the spin's new position.

Why does that matter? Existing computers, from laptops to supercomputers, see data as bits of information. Each bit can be either a zero or a one. But a quantum bit can be both zero and one at the same time, a situation known as a superposition state. This allows quantum computers to act like a massively parallel computer in some circumstances, solving problems that are almost impossible for classic computers to handle.

Quantum computing can be accomplished using a property of electrons known as "spin." The single unit of quantum information is the qubit, which can be constructed from a single electron spin; in this experiment, the spin was confined within a nano-sized semiconductor known as a quantum dot.

An electron spin may be described as up or down (a variation of the usual zero and one) and may be manipulated from one state to another. The faster these electrons can be switched, the more quickly numbers can be crunched in a quantum fashion, with its intrinsic advantages over traditional computing designs.

The qubit in the Stanford experiment was manipulated and measured about 100 times faster than with previous techniques, said one of the researchers, David Press, a graduate student in applied physics.


Still awfully cold. We need those puppies at higher temperatures if we're going to be using them in anything other than centralized installations. Everyday homes with cryogenics...that would be...different.
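If you want to see the up/down/superposition talk as actual math, here's a toy two-level sketch in Python. This is just the textbook state-vector picture, emphatically not the Stanford group's ultrafast optical technique:

```python
import numpy as np

# A qubit's state is a 2-component complex vector: spin up = |0>, down = |1>.
up = np.array([1, 0], dtype=complex)

def rotate_x(state, theta):
    """Rotate the spin about the X axis by angle theta (theta = pi flips it)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    rx = np.array([[c, -1j * s], [-1j * s, c]])
    return rx @ state

flipped = rotate_x(up, np.pi)        # full flip: all amplitude on |1>
halfway = rotate_x(up, np.pi / 2)    # equal superposition of |0> and |1>

for name, state in [("flipped", flipped), ("halfway", halfway)]:
    probs = np.abs(state) ** 2       # Born rule: measurement probabilities
    print(f"{name}: P(up) = {probs[0]:.2f}, P(down) = {probs[1]:.2f}")
```

The pi rotation is the "bit flip"; the pi/2 rotation parks the spin in an equal superposition of up and down, which is where the quantum parallelism comes from.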

Directly Imaged Exoplanet Chart



Seen via Universe Today.

Beta Pictoris Planet Imaged?


The hot star Beta Pictoris is one of the best-known examples of stars surrounded by a dusty 'debris' disc. Debris discs are composed of dust resulting from collisions among larger bodies like planetary embryos or asteroids. They are a bigger version of the zodiacal dust in our Solar System. Its disc was the first to be imaged — as early as 1984 — and remains the best-studied system. Earlier observations showed a warp of the disc, a secondary inclined disc and infalling comets onto the star. "These are indirect, but tell-tale signs that strongly suggest the presence of a massive planet lying between 5 and 10 times the mean Earth-Sun distance from its host star," says team leader Anne-Marie Lagrange. "However, probing the very inner region of the disc, so close to the glowing star, is a most challenging task."

In 2003, the French team used the NAOS-CONICA instrument (or NACO [1]), mounted on one of the 8.2 m Unit Telescopes of ESO's Very Large Telescope (VLT), to benefit from both the high image quality provided by the Adaptive Optics system at infrared wavelengths and the good dynamics offered by the detector, in order to study the immediate surroundings of Beta Pictoris.

Recently, a member of the team re-analysed the data in a different way to seek the trace of a companion to the star. Infrared wavelengths are indeed very well suited for such searches. "For this, the real challenge is to identify and subtract as accurately as possible the bright stellar halo," explains Lagrange. "We were able to achieve this after a precise and drastic selection of the best images recorded during our observations."

The strategy proved very rewarding, as the astronomers were able to discern a feeble, point-like glow well inside the star's halo. To eliminate the possibility that this was an artefact and not a real object, a battery of tests was conducted and several members of the team, using three different methods, did the analysis independently, always with the same success. Moreover, the companion was also discovered in other data sets, further strengthening the team's conclusion: the companion is real.

"Our observations point to the presence of a giant planet, about 8 times as massive as Jupiter and with a projected distance from its star of about 8 times the Earth-Sun distance, which is about the distance of Saturn in our Solar System [2]," says Lagrange.

"We cannot yet rule out definitively, however, that the candidate companion could be a foreground or background object," cautions co-worker Gael Chauvin. "To eliminate this very small possibility, we will need to make new observations that confirm the nature of the discovery."

The team also dug into the archives of the Hubble Space Telescope but couldn't see anything, "while most possible foreground or background objects would have been detected", remarks another team member, David Ehrenreich.

The fact that the candidate companion lies in the plane of the disc also strongly implies that it is bound to the star and its proto-planetary disc.


Getting closer!

Thursday, November 20, 2008

Mars Science Laboratory Landing Sites Down to 4 Possibilities

October 1219: Pelagius Removed


I am currently reading The Crusades Through Arab Eyes by Amin Maalouf. It's an interesting take on what happened in the Crusades from the POV of the Arabs, a rather different one. As with any history book, there are moments when you read it that scream out: what if x had happened instead! In this particular case, there are several such moments, but they are mostly covered by innumerable AH stories, such as what if Frederick Barbarossa hadn't drowned, the Crusaders had taken Aleppo, Saladin had bought it in the siege of Alexandria, or whatnot. This one deals with the strange, strange tale of the Fifth Crusade.

To set the stage: the Third Crusade had missed out on retaking the Holy Land despite some very interesting and insane battles with odd results, and the Fourth Crusade had done its dirty work. The POV from the book about Richard and Saladin's relationship is rather different than how it is typically portrayed in Western literature on the subject. Saladin had basically won against the Crusaders, and the Byzantines were smashed, never really to recover (alas). There was a slight uptick for them, but they never again got close to where they had been. Or could have been. The Crusader States held only a strip of coastal land, a position that was extremely tenuous.

The Fifth Crusade was kicked off in 1217 to try, once again, to recover the Holy Land. Maalouf attributes its start to John of Brienne, King of the Kingdom of Jerusalem (but not of the city), besieging the Pope with letters around 1210 asking him to send another Crusade. The Pope finally issued a bull in support of it in 1213. This Crusade was to be different, though.

One of the huge differences was that this Crusade was to be led by a Papal Legate, Pelagio Galvani, often called Pelagius: no kings were to lead this expedition. Another difference was that the Crusade went after what was seen as the source of strength that was causing the Crusaders to lose the Holy Land time and again: the wealth of Egypt. After an initial success in taking Damietta, Pelagius would make several strategic mistakes and lose everything. The Iberian Papal Legate made one huge mistake in particular: the ruler of Egypt and nephew of Saladin, al-Kamil, offered the Crusaders all of Palestine from the river Jordan to the sea, including Jerusalem, plus the return of the True Cross...if they'd just knock off their attack on Egypt. Pelagius ignored the offer despite John of Brienne's entreaties.

Now, the offer was somewhat problematic. Palestine was in the hands of al-Kamil's brother, al-Mu'azam, the ruler of Syria and Palestine, and he was unlikely to want to surrender the territory, even if it created a buffer state between the brothers, as has been conjectured. It is also likely that the offer was genuine, though, given al-Kamil's surrender of Jerusalem to Frederick II without much of a fight later, in the Sixth Crusade.

Sooo...what-if?

In this case, Pelagius croaks at the end of a successful siege of Damietta and John of Brienne is left in charge. al-Kamil still offers Palestine. John takes the offer and the two ride out to clobber al-Mu'azam. From there, we get the reestablishment of the Kingdom of Jerusalem to near its maximum size. So, call the reestablishment a fact by November 1221, after a siege of Damascus. What happens next is pretty important.

I bet that there is a truce. I'd say one lasting around 9 years, just for the round date. John's not going to want to give up expanding his Kingdom, but he's getting to be an old man; by the time the truce expires he'll be 60. If OTL is any guide, he'll only last another seven years. He'll pick a few fights, but since al-Kamil now controls most of Syria, he's somewhat boxed in: he'll not want to break the truce without help, though he'll probably seek that help. If he attacks anywhere, I bet it's Cyprus, but...I have the feeling he'll be too stuck to do it. He'll probably be involved in the politics of the Latin Empire of Constantinople, as he was OTL.

As a precautionary move, we'll say he still marries off his daughters to strengthen his position. What he does with Yolande is critical. Technically, John isn't king. He's only regent until he can marry off Yolande, who is the real heir through John's wife. OTL, he married Yolande off to Frederick II...and lost his throne. What are the possible other ways he can secure the kingdom through his daughters' marriages and still keep his seat?

Well, I need to wrap this up. Let's sum it up. What would John do to hold his throne? Would he, like so many Crusaders, screw it all up? How much longer would the Kingdom of Jerusalem last then? Only to the end of the truce? Could he still enlist Frederick to try to take Egypt? Or...?

Thoughts?

Eileanchelys waldmani: Transitional Turtle

(image credit: BBC)
Fossils of the earliest known swimming turtles have been uncovered on an island in northwest Britain, scientists reported today.

The fossils of the previously unknown species suggest turtles first took to water during the Middle Jurassic period (180 to 160 million years ago).

Four crushed but intact skeletons were found, along with the remains of two other specimens, in a single slab of rock on Scotland's Isle of Skye in 2004. Since then, researchers have painstakingly freed the fossils.

The fossils belonged to a pond turtle, Eileanchelys waldmani, which bridges the evolutionary gap between primitive land turtles and modern aquatic turtles.

Turtles first appeared in the Triassic period some 210 million years ago. They were exclusively land creatures, said study team member Jérémy Anquetin, a French Ph.D. student at the Natural History Museum in London.

These earliest turtles were heavy, lumbering creatures armed with thick shells and protective spikes, he said.

But the new fossil turtle had a domed, tortoise-like shell measuring up to 11.8 inches (30 centimeters) long and it was much more delicately built.

"It's light framed, just like an aquatic turtle," Anquetin said.

"Until the discovery of Eileanchelys, we thought that adaptation to aquatic habitat might have appeared among primitive turtles, but we had no fossil evidence of that," he added.

"Now we know for sure that there were aquatic turtles around 164 million years ago," Anquetin said.


Kewl! Callovian anapsida fossils! Woot! BBC also has a very nice science article on the subject. Whew. I was worried I'd only get HPC related posts out today.

Quantum Computers Do Chem Sims Better?

Quantum computers would likely outperform conventional computers in simulating chemical reactions involving more than four atoms, according to scientists at Harvard University, the Massachusetts Institute of Technology, and Haverford College. Such improved ability to model and predict complex chemical reactions could revolutionize drug design and materials science, among other fields.

Writing in the Proceedings of the National Academy of Sciences, the researchers describe "software" that could simulate chemical reactions on quantum computers, an ultra-modern technology that relies on quantum mechanical phenomena, such as entanglement, interference, and superposition. Quantum computing has been heralded for its potential to solve certain types of problems that are impossible for conventional computers to crack.

"There is a fundamental problem with simulating quantum systems -- such as chemical reactions -- on conventional computers," says Alán Aspuru-Guzik, assistant professor of chemistry and chemical biology in Harvard's Faculty of Arts and Sciences. "As the size of a system grows, the computational resources required to simulate it grow exponentially. For example, it might take one day to simulate a reaction involving 10 atoms, two days for 11 atoms, four days for 12 atoms, eight days for 13 atoms, and so on. Before long, this would exhaust the world's computational power."

Unlike a conventional computer, Aspuru-Guzik and his colleagues say, a quantum computer could complete the steps necessary to simulate a chemical reaction in a time that doesn't increase exponentially with the reaction's complexity.

"Being able to predict the outcomes of chemical reactions would have tremendous practical applications," says Ivan Kassal, a graduate student in chemical physics at Harvard. "A lot of research in drug design, materials science, catalysis, and molecular biology is still done by trial and error. Having accurate predictions would change the way these types of science are done."

The researchers demonstrate in PNAS that quantum computers would need to attain a size of about 100 qubits -- which are to quantum computers as bits are to conventional computers -- to outperform current classical supercomputers at a chemical simulation.

"This is still far beyond current prototype quantum computers," Kassal says. "And although it might take millions of quantum elementary operations on a few hundred quantum bits, our work suggests that with quantum computers that are as fast as modern conventional computers, one could simulate in seconds a chemical reaction that would take a conventional computer years."



Interesting. In a way it makes sense. I'd have to defer to the computational chem geeks on their opinions, but if it's a problem that scales exponentially with n, it does make sense.

Huh. Sounds a little similar to a coLabbie's work. HA! Same guys!
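Aspuru-Guzik's doubling example is worth plugging into a script, just to watch it blow up. A trivial sketch (his numbers for the base case, my extrapolation beyond it):

```python
# The scaling problem from the quote: classical simulation time doubles
# with each added atom (1 day at 10 atoms, 2 at 11, 4 at 12, ...).

def classical_sim_days(n_atoms, base_atoms=10, base_days=1.0):
    """Days to simulate a reaction if cost doubles for every atom added."""
    return base_days * 2 ** (n_atoms - base_atoms)

for n in (10, 11, 12, 13, 20, 30, 50):
    print(f"{n:2d} atoms: {classical_sim_days(n):,.0f} days")
```

By 50 atoms you're at 2^40 days, i.e., billions of years: hence "exhaust the world's computational power."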

SGI's Cell Phone Supercomputer Node


With 10,000 Processor Cores, Silicon Graphics Molecule 'Concept Computer' Uses Half the Power and Just 1.4 Percent of the Space of Comparable PC Cluster

Silicon Graphics, Inc. today offered a glimpse of the potential future of dense, power-efficient computing with the Silicon Graphics® Molecule™ concept computer.

Just as a futuristic concept car points to potential innovations in transportation, the Silicon Graphics Molecule is a concept computer that illustrates how the latest low-watt, multi-core consumer electronics technology, such as the Intel® Atom™ processor, can be combined with breakthrough Silicon Graphics® Kelvin™ cooling technology to pack more than 10,000 cores into a single rack.

Engineers at Silicon Graphics research labs developed the system to show how consumer electronics technologies and emerging marketplace trends might someday be applied to overcome the limits of today's high-throughput clusters. The Silicon Graphics Molecule concept computer balances processor speed, sustained memory bandwidth, and power consumption. The system has been shown to deliver sustained results on scientific and business problems from seismic processing to rendering and distributed searching.

Features of the Silicon Graphics Molecule concept computer include:

* High concurrency with 20,000 threads of execution — 40 times more than a single rack x86 cluster system
* High throughput with 15TB/sec of memory bandwidth per rack — over 20 times faster than a single rack x86 cluster system
* Greater balance with up to three times the memory bandwidth/OPS compared to current x86 CPUs
* High performance with approximately 3.5 times the computational performance per rack
* Greener with low-watt consumer CPUs and low-power memory that deliver 7 times better memory bandwidth/watt
* Innovative Silicon Graphics Kelvin cooling technology, which enables denser packaging by stabilizing thermal operations in densely configured solutions
* Operating environment flexibility, capable of running industry-standard Linux® implementations, with Microsoft® Windows® variants on some configurations

If someday brought to market, a single-rack system based on the Silicon Graphics Molecule concept computer would offer the computing power and memory bandwidth of more than 750 high-end PCs, yet it would consume less than half the power and less than 1.4 percent of the physical space.


Now that's closer to an HPC desktop. It would have to have a benchmark of 1.2 GFlops/core to make it onto the list. However, first off, it's only a concept computer. Secondly, this will make John at work very happy. Third: reliability. How often do I replace a CPU? Is it a 1% failure rate? What's the MTBF? Better be more than a billion hours: one million hours means there's a CPU failing every 4 days per node. I'd rather not be on the other side of the phone when a researcher calls irate that his 10 million core job had to get restarted again. Fourth, this still eats 40 kilowatts per node...minimum. hmmm. Big power plug. Better consumption per flop, but...

Okay. Mental exercise. I want 1024 nodes here (sorta a magic HPC number). I am going to assume a useful flop rate of 500 Mflops/core and a gigabyte of memory per core (we really like 2, but...). That means this system would have 10,240,000 cores and more than 10 petabytes of memory. (o.O) The useful sustained computational rate would be over 5 petaflops. However, we have a power requirement in excess of 41 megawatts just for the CPUs[1]. Not counting memory, chipset, spinning disk or interconnect. Ouch. MTBF of 1 billion hours? Means I am yanking a failed CPU every 4 days. Wait, is the MTBF even close to that?
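For the curious, here's that napkin math as a script, with all my assumptions called out as constants:

```python
# Napkin math for a hypothetical 1024-node Molecule machine, using the
# assumptions above: 10,000 cores/node, 500 useful Mflops/core, 1 GB/core,
# 40 kW/node for CPUs, and an optimistic per-CPU MTBF of 1 billion hours.

NODES = 1024
CORES_PER_NODE = 10_000
MFLOPS_PER_CORE = 500          # assumed *useful* rate, not peak
GB_PER_CORE = 1
KW_PER_NODE = 40
MTBF_HOURS = 1e9               # per CPU

cores = NODES * CORES_PER_NODE
print(f"Cores:            {cores:,}")
print(f"Memory:           {cores * GB_PER_CORE / 1e6:.2f} PB")
print(f"Sustained rate:   {cores * MFLOPS_PER_CORE / 1e9:.2f} Pflops")
print(f"CPU power:        {NODES * KW_PER_NODE / 1e3:.1f} MW")
# With N identical parts, the aggregate time between failures is MTBF / N.
print(f"CPU failure every {MTBF_HOURS / cores / 24:.1f} days")
```

Which prints back the numbers above: ~10.24 PB of memory, ~5.1 Pflops sustained, ~41 MW for CPUs alone, and a CPU pull every four days even at the optimistic billion-hour MTBF.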

Looks interesting, but, again, just a concept uber'puter node. We'll see how this meshes with SGI's HPC plans.


1. *mutters* Heat Death of the Singularity.

Vast Martian Lower Latitude Glaciers


Vast Martian glaciers of water ice under protective blankets of rocky debris persist today at much lower latitudes than any ice previously identified on Mars, says new research using ground-penetrating radar on NASA's Mars Reconnaissance Orbiter.

Because water is one of the primary requirements for life as we know it, finding large new reservoirs of frozen water on Mars is an encouraging sign for scientists searching for life beyond Earth.

The concealed glaciers extend for tens of miles from edges of mountains or cliffs and are up to one-half mile thick. A layer of rocky debris covering the ice may have preserved the glaciers as remnants from an ice sheet covering middle latitudes during a past ice age.

"Altogether, these glaciers almost certainly represent the largest reservoir of water ice on Mars that's not in the polar caps. Just one of the features we examined is three times larger than the city of Los Angeles, and up to one-half-mile thick, and there are many more," said John W. Holt of The University of Texas at Austin's Jackson School of Geosciences, lead author of a report on the radar observations in the Nov. 21 issue of the journal Science.

"In addition to their scientific value, they could be a source of water to support future exploration of Mars," said Holt.

The gently sloping aprons of material around taller features have puzzled scientists since NASA's Viking orbiters revealed them in the 1970s. One theory contended they were flows of rocky debris lubricated by a little ice. The features reminded Holt of massive ice glaciers detected under rocky coverings in Antarctica, where he has extensive experience using airborne geophysical instruments such as radar to study Antarctic ice sheets.

The Shallow Radar instrument on the Mars Reconnaissance Orbiter provided an answer to this Martian puzzle, indicating the features contain large amounts of ice.

"These results are the smoking gun pointing to the presence of large amounts of water ice at these latitudes," said Ali Safaeinili, a shallow-radar instrument team member with NASA's Jet Propulsion Laboratory in Pasadena, Calif.

The radar's evidence for water ice comes in multiple ways. The radar echoes received by the orbiter while passing over these features indicate that radio waves pass through the apron material and reflect off a deeper surface below without significant loss in strength, as expected if the aprons are thick ice under a relatively thin covering.

The radar does not detect reflections from the interior of these deposits as would occur if they contained significant rock debris. Finally, the apparent velocity of radio waves passing through the apron is consistent with a composition of water ice.

[...]

The buried glaciers reported by Holt and 11 co-authors lie in the Hellas Basin region of Mars' southern hemisphere. The radar has also detected similar-appearing aprons extending from cliffs in the northern hemisphere.


Maps of the water ice regions near interesting geological locations make for probable points where future bases will be set up. SF writers, would-be mission planners and colonists: take note! Also see Universe Today.
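For the radar-curious: the velocity trick works because radio waves slow down by the square root of a material's relative permittivity, and ice and rock differ a lot there. A rough sketch with typical textbook permittivities (water ice ~3.15, basalt ~8; these are my stand-ins, not the Holt et al. numbers):

```python
# Why radar can tell ice from rock: wave speed in a material is
# c / sqrt(relative permittivity), and the permittivities differ a lot.
# The values below are generic textbook figures, not SHARAD measurements.

C = 299_792_458.0  # speed of light in vacuum, m/s

def radar_velocity(rel_permittivity):
    """Radio-wave speed inside a material with the given permittivity."""
    return C / rel_permittivity ** 0.5

def apparent_thickness(two_way_time_us, rel_permittivity):
    """Layer thickness implied by a two-way echo delay, in meters."""
    one_way_s = two_way_time_us * 1e-6 / 2
    return radar_velocity(rel_permittivity) * one_way_s

# The same 10-microsecond echo delay reads very differently per material:
for name, eps in [("water ice", 3.15), ("basaltic rock", 8.0)]:
    print(f"{name}: v = {radar_velocity(eps) / 1e8:.2f}e8 m/s, "
          f"10 us delay -> {apparent_thickness(10, eps):.0f} m thick")
```

Same echo delay, very different implied thickness, so with an independent handle on the apron geometry, the permittivity (and hence the composition) falls out.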

*SCHMACK*

Today, scientific research is carried out on supercomputing clusters, a shared resource that consumes hundreds of kilowatts of power and costs millions of dollars to build and maintain. As a result, researchers must fight for time on these resources, slowing their work and delaying results. NVIDIA and its worldwide partners today announced the availability of the GPU-based Tesla™ Personal Supercomputer, which delivers the equivalent computing power of a cluster, at 1/100th of the price and in a form factor of a standard desktop workstation.

“We’ve all heard ‘desktop supercomputer’ claims in the past, but this time it’s for real,” said Burton Smith, Microsoft Technical Fellow. “NVIDIA and its partners will be delivering outstanding performance and broad applicability to the mainstream marketplace. Heterogeneous computing, where GPUs work in tandem with CPUs, is what makes such a breakthrough possible.”

Priced like a conventional PC workstation yet delivering 250 times the processing power, the Tesla Personal Supercomputer gives researchers the horsepower to perform complex, data-intensive computations right at their desks, processing more data faster and cutting time to discovery.

“GPUs have evolved to the point where many real world applications are easily implemented on them and run significantly faster than on multi-core systems,” said Prof. Jack Dongarra, director of the Innovative Computing Laboratory at the University of Tennessee and author of LINPACK. “Future computing architectures will be hybrid systems with parallel-core GPUs working in tandem with multi-core CPUs."

Leading institutions including MIT, the Max Planck Institute, University of Illinois at Urbana-Champaign, Cambridge University, and others are already advancing their research using GPU-based personal supercomputers.

“GPU based systems enable us to run life science codes in minutes rather than the hours it took earlier. This exceptional speedup has the ability to accelerate the discovery of potentially life-saving anti-cancer drugs,” said Jack Collins, manager of scientific computing and program development at the Advanced Biomedical Computing Center in Frederick Md., operated by SAIC-Frederick, Inc.

At the core of the GPU-based Tesla Personal Supercomputer is the Tesla C1060 GPU Computing Processor which is based on the NVIDIA® CUDA™ parallel computing architecture. CUDA enables developers and researchers to harness the massively parallel computational power of Tesla through industry standard C.



BAD NVIDIA! NO COOKIE! Why?

From the press release of the Top500 of Nov 2008:

The entry level to the list moved up to the 12.64 Tflop/s mark on the Linpack benchmark, compared to 9.0 Tflop/s six months ago.


That means you still need to cluster those boxen to make the list. Being a supercomputer is a moving target. Now. Would I LOVE to try to build a true HPC system from this? Oh yeah. Will I get to?

*punches in nvidia reps number*

*ring*ring*
*ring*ring*
*ring*ring*
...

*ring*ring*

*puts down phone*

Hrmph. They gotta have caller ID.

Just kidding. However, I really hate this HPC on a Desktop marketing BS everyone keeps doing. REALLY hate it.

Wednesday, November 19, 2008

Hans Reiser: The Mad Undead Programmer


Hans Reiser wants a trial do-over.

Reiser is the Linux guru who in April was convicted of the first-degree murder of his estranged wife. He's the same defendant who, in exchange for a 15-to-life term instead of a 25-to-life term, brought authorities to the Oakland hills where he buried Nina Reiser's body.

He even apologized for killing her.

But in a handwritten appellate motion, he is appealing his conviction. Yet there's a glaring problem with this appeal, in which he claims he thought the deal would have only sent him away for three years, not 15-to-life.

When he took the 15-to-life deal in August, he waived his right to appeal. And when entering the deal, he said he understood what he was doing and was represented by effective counsel. The appeal was first reported by the San Francisco Chronicle.

Perhaps Reiser is a little peeved that he turned down a pretrial deal last year with Alameda County Superior Court Judge Larry Goodman, in which the developer of the ReiserFS file system was offered three years if he pleaded guilty and disclosed where he hid 31-year-old Nina Reiser's body.

But the boyhood genius thought he could outsmart the jury, which grew tired of his hours on the witness stand attempting to explain away a myriad of coincidences linking him to his wife's murder — days of testimony his attorney said was against his advice.

Now the 44-year-old Reiser says he thinks the latest deal was supposed to have netted three years. And he said his lead attorney, William DuBois, who he often butted heads with during trial, was out to get him.

Reiser wrote (.pdf) that he believed DuBois suffered from an excess of oxytocin.

"Persons with oxytocin excess enjoy betraying others," Reiser wrote.

Reiser added, "I believe that the attraction of duping and betraying me exceeded the attraction of duping and betraying a jury, due to my unique personal characteristics."

Reiser demands that DuBois' oxytocin levels be tested, and that if they are high, the courts should determine that Reiser was a victim of ineffective assistance of counsel and be granted a new trial.

DuBois was not immediately available for comment. Trial judge Larry Goodman denied (.pdf) Reiser's handwritten motion, which wasn't even made to a California appellate court.

Reiser called his lawyer delusional and claimed he picked jurors based on Chinese astrology. "It is a logical necessity that either I am delusional, or he is delusional."

Threat Level covered the trial gavel to gavel for six months and knows the answer to that question.

The convict is also demanding a polygraph test "to prove my counsel colluded with agents of the State of California to perpetuate numerous other frauds relating to the case."


There ya have it, folks. He really snapped. Never mind he waived his right to an appeal. He's Hans and therefore the world shall give way. *shakes head*

Holy Martian Cow

Woolly Mammoth Genome Sequenced


Scientists at Penn State are leaders of a team that is the first to report the genome-wide sequence of an extinct animal, according to Webb Miller, professor of biology and of computer science and engineering and one of the project's two leaders. The scientists sequenced the genome of the woolly mammoth, an extinct species of elephant that was adapted to living in the cold environment of the northern hemisphere. They sequenced four billion DNA bases using next-generation DNA-sequencing instruments and a novel approach that reads ancient DNA highly efficiently.

"Previous studies on extinct organisms have generated only small amounts of data," said Stephan C. Schuster, Penn State professor of biochemistry and molecular biology and the project's other leader. "Our dataset is 100 times more extensive than any other published dataset for an extinct species, demonstrating that ancient DNA studies can be brought up to the same level as modern genome projects."

The researchers suspect that the full woolly-mammoth genome is over four billion DNA bases, which they believe is the size of the modern-day African elephant's genome. Although their dataset consists of more than four billion DNA bases, only 3.3 billion of them -- a little over the size of the human genome -- currently can be assigned to the mammoth genome. Some of the remaining DNA bases may belong to the mammoth, but others could belong to other organisms, like bacteria and fungi, from the surrounding environment that had contaminated the sample. The team used a draft version of the African elephant's genome, which currently is being generated by scientists at the Broad Institute of MIT and Harvard, to distinguish those sequences that truly belong to the mammoth from possible contaminants.

"Only after the genome of the African elephant has been completed will we be able to make a final assessment about how much of the full woolly-mammoth genome we have sequenced," said Miller. The team plans to finish sequencing the woolly mammoth's genome when the project receives additional funding.

The team sequenced the mammoth's nuclear genome using DNA extracted from the hairs of a mammoth mummy that had been buried in the Siberian permafrost for 20,000 years and a second mammoth mummy that is at least 60,000 years old. By using hair, the scientists avoided problems that have bedeviled the sequencing of ancient DNA from bones because DNA from bacteria and fungi, which always are associated with ancient DNA, can more easily be removed from hair than from bones. Another advantage of using hair is that less damage occurs to ancient DNA in hair because the hair shaft encases the remnant DNA like a biological plastic, thus protecting it from degradation and exposure to the elements.


Wow. Read the rest.

Russia's Reserves: How Long Will They Last?


Russia's finance minister sought Wednesday to reassure investors and citizens that the economy will survive the global financial turmoil, saying Russia's rainy day fund will last for at least 7 years under the worst-case scenario.

Despite a plunge in stock markets, oil revenues and the ruble, Alexei Kudrin said Russia's vast reserves — which have been accumulated in the 8-year-long oil boom — "have laid a solid foundation for a stable macroeconomy and the rate of the national currency."

Russia may tap the Reserve Fund for up to 500 billion rubles ($18 billion) next year to make up for declining budget revenues, Kudrin said.

If Russia's 3.5 trillion ruble ($127 billion) rainy day fund "won't be replenishing and Russia will be spending 500 billion rubles from it annually, it will last for at least 7 years." The fund may last up to 20 years, he added, but this will depend on the pace of economic growth.

National development bank VEB has already received 90 billion rubles ($3.3 billion) to support the plunging stock market and will get $3.2 billion in the next few months, Kudrin told the parliament.

[...]

The Central Bank's chairman Sergei Ignatyev told lawmakers that it had spent $57.5 billion from its foreign currency reserves in September and October to back the declining ruble during the financial crisis.

In addition to fluctuations in currency exchange rates, this has caused Russia's international reserves to fall by $97.6 billion during the period, Ignatyev told the lower chamber of the Russian parliament.

Pressed by plunging oil prices the Russian Central Bank last week loosened its "managed float" policy as it widened the ruble's trading corridor by 0.30 ruble, a move that fueled fears that the ruble is in for a big drop.

Russia's presidential aide Arkady Dvorkovich on Wednesday said again that the government would not let the national currency tumble.

"The Central Bank is in full control of the situation," Dvorkovich said in televised remarks. He admitted that lower oil prices may affect the ruble, but pledged that "there will be no devaluation".


hmmmm. Didn't an infamous President of a certain republic once say he'd defend his currency like a dog? Only to be yipped and barked at for years afterwards when it uberdevalued anyway? Where do you find the stats for the Russian foreign currency accounts? I wonder how much they have for defending the ruble. $57G is a frak load of dinero. Anyways, Putin might want to wait before he does his Kremlin Comeback.

Tuesday, November 18, 2008

Another Crispy Hot Jupiter Found

A team of astronomers from Penn State and Nicolaus Copernicus University in Poland has discovered a new planet that is closely orbiting a red-giant star, HD 102272, which is much older than our own Sun. The planet has a mass that is nearly six times that of Jupiter, the largest planet in our solar system. The team includes Alexander Wolszczan, the discoverer of the first planets ever found outside our solar system, who is an Evan Pugh Professor of Astronomy and Astrophysics and the director of the Center for Exoplanets and Habitable Worlds at Penn State; and Andrzej Niedzielski, who leads his collaborators in Poland. The team suspects that a second planet may be orbiting HD 102272, as well. The findings, which will be published in a future issue of The Astrophysical Journal, shed light on the ways in which aging stars can influence nearby planets.

Scientists already know that stars expand as they age and that they eventually may gobble up adjacent planets. In fact, scientists expect our own planet to be swallowed up by the Sun in about a billion years. But what scientists don't yet understand fully is how aging stars influence nearby planets before they are destroyed. The team's newly discovered planet is interesting because it is located closer to a red-giant star than any other known planet.

"When red-giant stars expand, they tend to eat up the nearby planets, " said Wolszczan. "Although the planet we discovered conceivably could be closer to the star without being harmed by it, there appears to be a zone of avoidance around such stars of about 0.6 astronomical units, which is a little more than half of the distance from the Earth to the Sun. It is important to find out why planets don't want to get any closer to stars, so one of our next steps is to try to figure out why this zone of avoidance exists and whether it occurs around all red-giant stars."


Not much time to comment here.

New Cretaceous Bolivian Trackway Found


Bolivian farmer Primo Rivera had long wondered about the dents in a rocky hill near his home. Paleontologists solved the mystery this month: they are fossilized dinosaur footprints -- the oldest in Bolivia.

"I used to come to look at the prints when I was a kid ... but I didn't know what had made them," said Rivera, 35, who lives in the southern province of Chuquisaca.

The fossilized footsteps that intrigued Rivera for two decades are thought to be about 140 million years old, much older than other dinosaur prints found in the Andean country.

"The footprints we've found are important because they're the oldest ever found in Bolivia ... and the oldest footprints of Ankylosaurus ever found in the Southern Hemisphere," said Argentine paleontologist Sebastian Apesteguia in Buenos Aires.

Apesteguia, who led a two-week expedition sponsored by Chuquisaca's regional government, thinks the footprints belong to three different kinds of dinosaurs, including Ankylosaurus, an armored herbivore.

He said some of the prints were about 14 inches long, suggesting that the dinosaurs were "medium-sized ... about nine or 10 meters (about 30 feet) in length."

Close to the larger prints, the paleontologists found smaller ones that probably belonged to baby dinosaurs, indicating the offspring "were given some kind of care," Apesteguia said.


Dino tracks from SoAm from the Berriasian-Valanginian (Early/Lower Cretaceous) Boundary. Interesting, interesting. They look like sauropod tracks, but it's hard to tell for sure with those images. There are more through the link.

Monday, November 17, 2008

Martian Oceans of Antiquity

Top 500 List


Despite the many press releases to the contrary, Los Alamos (WOO!) continues to hold the top spot with their Roadrunner system (meet-meet!) over Oak Ridge's Jaguar (note: egg->cpu, not a good pointer). We're on there with our Franklin system at #7: we did a lil upgrade. ;)

The Chinese are present in the top 10, too, btw! Running ... Windows. ew.

There's a small analysis here so you can get a look at the trends.

Oh, yeah, it's Supercomputing 08 in Austin. I'm not there, but there's a lot of news that comes out from this conference. Expect more this week from me.

Bill Kramer Moves to NCSA from NERSC


After 12 years at NERSC, Bill Kramer will be leaving his post as general manager to undertake a new position as Deputy Project Director for Blue Waters Project at the National Center for Supercomputing Applications (NCSA), in Urbana, Ill.

"LBNL and NERSC are very special. I have been at both longer by far than any place I have worked because of the mission to impact diverse science, the fantastic staff and commitment to the highest quality systems and services," says Kramer. "What I will miss most are the NERSC people. People make NERSC work, and I was fortunate to work with innovative and highly dedicated people."

During his tenure at the Berkeley Lab, Kramer saw NERSC through many major transitions, including a move from the Lawrence Livermore National Laboratory to Berkeley; a migration of the entire user community from vector supercomputers to highly parallel computing; and the design and implementation of both the NERSC system architecture and the NERSC service architecture.

This past year, Kramer played an integral role in managing the hardware upgrade of NERSC's Cray XT4 system, called Franklin, to quad-core processors, and setting up the procurement process for the NERSC-6 system, the next major supercomputer acquisition to support the Department of Energy Office of Science's computational challenges.

"I have always been attracted to places that are trying to do what no one else has. Over the past decade, NERSC has redefined what it means to be a supercomputer center," says Kramer.

"In his time at NERSC, Bill has successfully stood up some of the world's fastest machines and established the standard by which production computing centers are run," says Kathy Yelick, NERSC Division Director. "As I transitioned into the role of NERSC Director this past year, Bill's wealth of knowledge and experience was invaluable to me."

"I have worked together with Bill for almost 20 of the last 22 years, and have come to appreciate him as a great colleague: reliable, energetic, and always full of new ideas," says Horst Simon, Associate Laboratory Director (ALD) for Computing Sciences at Berkeley Lab. "I am disappointed to see Bill leave, but I am grateful for the many years where he shared his expertise and contributions. I wish him the best of luck in his future endeavor."

Originally from New York City, Kramer moved to Chicago, Ill. with his wife Laura, shortly after graduating from college. From his home in Illinois, Kramer commuted to Indiana every day to do computing work for a steel mill. He moved to California's Bay Area to put the world's first UNIX supercomputer into service as part of NASA Ames' Numerical Aerodynamic Simulation program.

"Laura and I have loved the Bay Area for a long time --- for the innovation, diversity, culture, and weather," says Kramer. "Since we lived in Illinois shortly after college, for our first jobs, we can't claim to be naive about the difference in weather. But every place we have lived, the Midwest, East and West, we have enjoyed and cherished for different reasons. We look forward to the university town life and making new friends, not to mention rooting for Purdue when they come to town."


Bill's an interesting guy and I am going to miss having him around. Very bright, and a man that really never sleeps. NCSA HPCers, be ready for someone that works as hard as he works you. We very frequently were getting messages at 3 am on what to do the next day...even when he was here in the Bay Area. The man just has boundless energy.

Bill: Good luck!

Sunday, November 16, 2008

Moschorhinus: A Therocephalian


To keep myself busy, since Lyuda and Avrora are off with her family right now, I am researching my next therapsid post on the therocephalians. There's someone out there doing some interesting drawings, I have to say.

Coal Fires: The Eternal Flame

Coal fires can occur naturally and are not a new phenomenon. Australia's Burning Mountain has smoldered for thousands of years. An underground coal fire in Centralia, Pa., began in 1962, eventually opening sinkholes that threatened to gobble and incinerate pets and children. Centralia became a ghost town, and experts say that the fire there may burn for a century or more.

At the Rujigou coalfield in the Ningxia Autonomous Region of western China, fires have burned since the late Qing Dynasty (1644-1911). Legend has it that coal miners who were angry over not being paid started a coal fire more than a century ago.

"It was industrial revenge," Guan said.


Wow. I am really glad we didn't get a lot of that here in the States. I could see the alt-history book with the title: Kentucky Burning.

Thursday, November 13, 2008

Hubble, Keck Directly View Gas Giant


The planet discovered by Hubble is one of the smallest exoplanets found yet, somewhere between the size of Neptune and three times the size of Jupiter. And it may have a Saturn-like ring.

It circles the star Fomalhaut, pronounced FUM-al-HUT, which is Arabic for "mouth of the fish." It's in the constellation Piscis Austrinus and is relatively close by — a mere 148 trillion miles away, practically a next-door neighbor by galactic standards. The planet's temperature is around 260 degrees, but that's cool by comparison to other exoplanets.

The planet is only about 200 million years old, a baby compared to the more than 4 billion-year-old planets in our solar system. That's important to astronomers because they can study what Earth and planets in our solar system may have been like in their infancy, said Paul Kalas at the University of California, Berkeley. Kalas led the team using Hubble to discover Fomalhaut's planet.


I just talked with someone working on the instruments of the terrestrial planet hunt. I should have bet ya, Mike, on this one too.

BTW, that sucker formed at the time of the Triassic-Jurassic Boundary. Ponder THAT.
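
Also, for anyone who wants to sanity-check the "next-door neighbor" line: 148 trillion miles works out to roughly 25 light-years. A quick back-of-envelope in Python, my arithmetic rather than anything from the article:

    # Convert the article's ~148 trillion miles into light-years.
    # One light-year is the standard ~5.879 trillion miles.
    MILES_PER_LIGHT_YEAR = 5.879e12
    distance_miles = 148e12
    print(distance_miles / MILES_PER_LIGHT_YEAR)  # ~25.2 light-years

Which really is spitting distance by galactic standards.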

NorAm Maniraptor Nest "Found"

It has all the hallmarks of a Cretaceous melodrama. A dinosaur sits on her nest of a dozen eggs on a sandy river beach. Water levels rise, and the mother is faced with a dilemma: stay, or abandon her unhatched offspring to the flood and scramble to safety?

Seventy-seven million years later, scientific detective work conducted by University of Calgary and Royal Tyrrell Museum researchers used this unique fossil nest and eggs to learn more about how nest building, brooding and eggs evolved. But there is a big unresolved question: Who was the egg-layer?

"Working out who the culprit was in this egg abandonment tragedy is a difficult problem to crack," says Darla Zelenitsky, U of C paleontologist and the lead author of a paper published today in the journal Palaeontology. "After further investigation, we discovered that this find is rarer than we first thought. It is a one of a kind fossil. In fact, it is the first nest of its kind in the world."

Zelenitsky says she first saw the nest, which had been collected in Montana in the 1990s, in a private collection. The nest was labeled as belonging to a hadrosaur (duck-billed) dinosaur, but she soon discovered it had been misidentified. In putting all the data together, she realized they had a small theropod (meat-eating) dinosaur nest. "Nests of small theropods are rare in North America and only those of the dinosaur Troodon have been identified previously," says Zelenitsky. "Based on characteristics of the eggs and nest, we know that the nest belonged to either a caenagnathid or a small raptor, both small meat-eating dinosaurs closely related to birds. Either way, it is the first nest known for these small dinosaurs."


I wonder if they will ever figure out which one...

Freakin TEASE!

Back in the late '60s the Polish-Mongolian expedition discovered and collected the giant arms (see cast above) of the enigmatic theropod Deinocheirus, known only from these arms. Near the end of the 2008 KID expedition, Philip Currie discovered the location of the near-fabled Deinocheirus quarry (below), and we spent part of a day reopening it. And we did collect more of the skeleton...


ARGH! TELL! TELL!

Wednesday, November 12, 2008

Anthropogenic Carbon Dioxide Has Averted the Worst Glacial Cycle Ever


Deep ice sheets would cover much of the Northern Hemisphere thousands of years from now—if it weren't for us pesky humans, a new study says.

Emissions of greenhouse gases—such as the carbon dioxide, or CO2, that comes from power plants and cars—are heating the atmosphere to such an extent that the next ice age, predicted to be the deepest in millions of years, may be postponed indefinitely.

"Climate skeptics could look at this and say, CO² is good for us," said study leader Thomas Crowley of the University of Edinburgh in Scotland.

But the idea that global warming may be staving off an ice age is "not cause for relaxing, because we're actually moving into a highly unusual climate state," Crowley added.

In about 10,000 to 100,000 years, the study suggests, Antarctic-like "permanent" ice sheets would shroud much of Canada, Europe, and Asia.

"I think the present [carbon dioxide] levels are probably sufficient to prevent that from ever happening," said Crowley, whose study will appear tomorrow in the journal Nature.

For the past three million years, Earth's climate has wobbled through dozens of ice ages, with thick ice sheets growing from the poles and then shrinking back again.

These ice ages used to last roughly 41,000 years. But in the past half a million years, these big freezes each stretched to about a hundred thousand years long.

Meanwhile, the temperature swings during and between these ice ages became more extreme, soaring to new highs and lows.

The researchers found that between 10,000 and 100,000 years from now, Earth would enter into a period of permanent ice sheets—more severe than any seen in millions of years.

In some ways the ice age would be like those in the past few hundred thousand years, with a thick ice sheet covering North America, the study predicted.

But in the model, Europe and Asia also succumbed to ice sheets up to 2 miles (3.5 kilometers) thick, stretching from England to Siberia—something never before seen in models of past ice ages.

"We were surprised," Crowley said. "There's no evidence for this in Asia" during ice ages in the past few million years.


Just imagine the Sea Level Fall! QUICK! Buy the 'land rights' to the continental shelf! oh...ermm...

A bit more seriously, I have long felt that the interglacial-glacial cycle has not yet run its course and that declaring us out of the Pleistocene was a little premature. The Milankovitch cycles are certainly not over, nor are the conditions that have kept them from throwing the world into another ice age gone.

Or, at least, they wouldn't still be with us if industry-generated carbon dioxide and other greenhouse gases (*cough*methane*cough*) hadn't been introduced into the atmosphere. The warming, and the time it will take to sequester that much CO2, will probably push the next glacial cycle into the 100k+ year zone, if it comes at all. We are doing major atmospheric engineering here whether we like it or not.

However, on a lighter note, should someone want to write a very different future, one that is near-term happy but long-term not so good, this may be it. We fix the global warming problem, but X kiloyears in the future much of the planet is iced over. Not near term like Fallen Angels. Bleh.

How Long Could You Survive Chained to a Bunk Bed with a Velociraptor?!

I could survive for 1 minute, 16 seconds chained to a bunk bed with a velociraptor

Created by Bunk Beds.net

OpenLab Worthy Posts?



I have five posts that are eligible for OpenLab 2008. IDK if they are really worthy or not. They're just essays on paleo topics, intended to put out some information about subjects that I find interesting and that are not covered as much as I think they ought to be. I find the idea of submitting my own posts distasteful simply because it feels too ... dirty ... and overly self-promotional. On the other hand, the post that I think really deserved to be included - Stop Dreaming - is ineligible because it is a 2007 post. So I am going to ask my readers: are the posts below worth submitting?

Once Upon the Permian: Gazes of Fear (gorgonopsids)
Once Upon the Permian: Beaked Bites of a Lost Lineage (dicynodonts)
The Ecology of the Carbon Age (Carboniferous carbon cycling)
The Caste Ecology of the Age of Carbon (plant and ecological distributions, caste-like)
Gasping for Paleo Air (oxygen levels through the Phanerozoic, especially the Mesozoic)

So, thoughts? Any of the above? None of them?

Update: One more. Were the Basal Archosaurs Endothermic?

New Kind of Aurora Found


On Saturn. By Cassini. With the infrared camera.

(looks like a rose too...)

Tuesday, November 11, 2008

More Wallacean Evolution Than Previously Thought?

A team of Princeton University scientists has discovered that chains of proteins found in most living organisms act like adaptive machines, possessing the ability to control their own evolution.

The research, which appears to offer evidence of a hidden mechanism guiding the way biological organisms respond to the forces of natural selection, provides a new perspective on evolution, the scientists said.

The researchers -- Raj Chakrabarti, Herschel Rabitz, Stacey Springs and George McLendon -- made the discovery while carrying out experiments on proteins constituting the electron transport chain (ETC), a biochemical network essential for metabolism. A mathematical analysis of the experiments showed that the proteins themselves acted to correct any imbalance imposed on them through artificial mutations and restored the chain to working order.

"The discovery answers an age-old question that has puzzled biologists since the time of Darwin: How can organisms be so exquisitely complex, if evolution is completely random, operating like a 'blind watchmaker'?" said Chakrabarti, an associate research scholar in the Department of Chemistry at Princeton. "Our new theory extends Darwin's model, demonstrating how organisms can subtly direct aspects of their own evolution to create order out of randomness."

The work also confirms an idea first floated in an 1858 essay by Alfred Wallace, who along with Charles Darwin co-discovered the theory of evolution. Wallace had suspected that certain systems undergoing natural selection can adjust their evolutionary course in a manner "exactly like that of the centrifugal governor of the steam engine, which checks and corrects any irregularities almost before they become evident." In Wallace's time, the steam engine operating with a centrifugal governor was one of the only examples of what is now referred to as feedback control. Examples abound, however, in modern technology, including cruise control in autos and thermostats in homes and offices.

The research, published in a recent edition of Physical Review Letters, provides corroborating data, Rabitz said, for Wallace's idea. "What we have found is that certain kinds of biological structures exist that are able to steer the process of evolution toward improved fitness," said Rabitz, the Charles Phelps Smyth '16 Professor of Chemistry. "The data just jumps off the page and implies we all have this wonderful piece of machinery inside that's responding optimally to evolutionary pressure."

The authors sought to identify the underlying cause for this self-correcting behavior in the observed protein chains. Standard evolutionary theory offered no clues. Applying the concepts of control theory, a body of knowledge that deals with the behavior of dynamical systems, the researchers concluded that this self-correcting behavior could only be possible if, during the early stages of evolution, the proteins had developed a self-regulating mechanism, analogous to a car's cruise control or a home's thermostat, allowing them to fine-tune and control their subsequent evolution. The scientists are working on formulating a new general theory based on this finding they are calling "evolutionary control."

The work is likely to provoke a considerable amount of thinking, according to Charles Smith, a historian of science at Western Kentucky University. "Systems thinking in evolutionary studies perhaps began with Alfred Wallace's likening of the action of natural selection to the governor on a steam engine --- that is, as a mechanism for removing the unfit and thereby keeping populations 'up to snuff' as environmental actors," Smith said. "Wallace never really came to grips with the positive feedback part of the cycle, however, and it is instructive that through optimal control theory Chakrabarti et al. can now suggest a coupling of causalities at the molecular level that extends Wallace's systems-oriented approach to this arena."


Whoa.

ummm.

I need input from those who are more familiar with this. Someone want to translate the paper?
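
While we wait on a translation, the governor/cruise-control/thermostat analogy itself is easy enough to pin down. Below is a minimal sketch of negative feedback control in Python. To be clear, this is generic control theory, not the Chakrabarti et al. model, and every number in it is made up:

    # Negative feedback: measure the deviation from a setpoint and push
    # back against it, like a thermostat or a steam engine's governor.
    setpoint = 20.0   # the target state (temperature, speed, whatever)
    state = 5.0       # wherever the system starts
    gain = 0.3        # how hard the controller corrects each step
    for _ in range(30):
        error = setpoint - state   # how far off are we?
        state += gain * error      # correct in proportion to the error
    print(round(state, 3))         # ~20.0: the disturbance gets damped out

The press release's claim, as far as I can tell, is that the electron transport chain does something loosely like that loop in response to mutational perturbations.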

Probably for Carlos: Forced Evolution?!

The study, which is available online and slated for publication in the journal Physical Review E, offers the most comprehensive mathematical analysis to date of the mechanisms that drive evolution in viruses and bacteria. Rather than focusing solely on random genetic mutations, as past analyses have, the study predicts exactly how evolution is affected by the exchange of entire genes and sets of genes.

"We wanted to focus more attention on the roles that recombination and horizontal gene transfer play in the evolution of viruses and bacteria," said bioengineer Michael Deem, the study's lead researcher. "So, we incorporated both into the leading models that are used to describe bacterial and viral evolution, and we derived exact solutions to the models."

The upshot is a newer, composite formula that more accurately captures what happens in real world evolution. Deem's co-authors on the study include Rice graduate student Enrique Muñoz and longtime collaborator Jeong-Man Park, a physicist at the Catholic University of Korea in Bucheon.

In describing the new model, Deem drew an analogy to thermodynamics and discussed how a geneticist or drug designer could use the new formula in much the same way that an engineer might use thermodynamics formulas.

"Some of the properties that describe water are density, pressure and temperature," said Deem. "If you know any two of them, then you can predict any other one using thermodynamics.

"That's what we're doing here," he said. "If you know the recombination rate, mutation rate and fitness function, our formula can analytically predict the properties of the system. So, if you have recombination at a certain frequency, I can say exactly how much that helps or hurts the fitness of the population."

Deem, Rice's John W. Cox Professor in Biochemical and Genetic Engineering and professor of physics and astronomy, said the new model helps to better describe the evolutionary processes that occur in the real world, and it could be useful for doctors, drug designers and others who study how diseases evolve and how our immune systems react to that evolution.

One idea that was proposed about five years ago is "lethal mutagenesis." In a nutshell, the idea is to design drugs that speed up the mutation rates of viruses and push them beyond a threshold called a "phase transition." The thermodynamic analogy for this transition is the freezing or melting of water -- which amounts to a physical transition between water's liquid and solid phases.

"Water goes from a liquid to a solid at zero degrees Celsius under standard pressure, and you can represent that mathematically using thermodynamics," Deem said. "In our model, there's also a phase transition. If the mutation, recombination or horizontal gene transfer rates are too high, the system delocalizes and gets spread all over sequence space."

Deem said the new results predict which parameter values will lead to this delocalization.


???

uuuuh. Looks interesting. For some reason it sounds scary, too. I Am Not A Biologist at this point in my life, soooo...
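
The cartoon version I can follow is Eigen's old quasispecies "error threshold," which at least sounds like a cousin of the delocalization Deem describes. Here's a toy calculation, my own illustration and emphatically not the Deem/Munoz/Park formula:

    # Toy error-threshold estimate from quasispecies theory. A "master"
    # sequence of length L with fitness advantage sigma stays localized
    # roughly while sigma * (1 - u)**L > 1, where u is the per-site
    # mutation rate; past u_crit the population smears over sequence space.
    L = 100        # sequence length (assumed for illustration)
    sigma = 10.0   # master sequence's fitness advantage (assumed)
    u_crit = 1.0 - sigma ** (-1.0 / L)
    print(u_crit)  # ~0.023 mutations per site per generation

Lethal mutagenesis, as I read it, is the drug-design idea of deliberately shoving the mutation rate past that kind of threshold.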

ESA ExoMars Rover Delayed Again


Dordain said, though, that the Martian rover ExoMars, which had previously been planned for launch in November 2013, would head to the Red Planet in early 2016.

Planning concerns made it preferable to aim for the next launch window for Mars, he said, insisting that there had not been budgetary problems.

The celestial ballet between Earth and Mars means that the distance between the two planets varies between 55 million kilometres (34 million miles) and more than 400 million kilometres.

The ExoMars mission entails sending a 200-kilogramme (440-pound) wheeled rover, which will carry a drill enabling it to delve up to two metres (seven feet) below the surface to see if the Red Planet has microbial life, or the potential for it.

ExoMars was initially planned for launch in 2011, but this date had already slipped by two years to help resolve what its goals should be.


I hope this doesn't get delayed into oblivion.
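
For what it's worth, the "celestial ballet" is also why the slip is from 2013 to 2016 rather than a few months: low-energy launch windows to Mars recur on the Earth-Mars synodic period, roughly every 26 months. Quick check, using standard orbital periods and my own arithmetic:

    # Earth-Mars synodic period: how often the two planets' geometry
    # repeats, setting the cadence of cheap transfer windows.
    earth_year = 365.25   # days
    mars_year = 686.98    # days
    synodic_days = 1.0 / (1.0 / earth_year - 1.0 / mars_year)
    print(synodic_days)   # ~780 days, i.e. roughly 26 months

Miss one window and you wait out the next full cycle.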

VOTE BRIAN!

There's a scholarship of $10k for science bloggers. The internet folks get to decide who gets it based on a vote. Brian Switek of Laelaps has become a candidate and I would strongly suggest that people go vote for him. Please, go vote for him. Brian's a good guy and an excellent sci-blogger. It will only take a couple seconds of your time.

Thanx, folks.


A Little More Diverse Than We Thought


3 species!

DOE to Cut Funding to Elevated CO2 Forest Experiment

For more than a decade, the federal government has spent millions of dollars pumping elevated levels of carbon dioxide into small groups of trees to test how forests will respond to global warming in the next 50 years.

Some scientists believe they are on the cusp of receiving key results from the time-consuming experiments.

The U.S. Department of Energy, however, which is funding the project, has told the scientists to chop down the trees, collect the data and move on to new research. That plan has upset some researchers who have spent years trying to understand how forests may help stave off global warming, and who want to keep the project going for at least a couple more years.

"There has been an investment in these experiments and it's a shame we are going to walk away from that investment," said William Chameides, an atmospheric scientist at Duke University, where one of the experimental forests is located. "There is no question that ultimately we want to cut the trees down and analyze the soil. The question is whether now is the time to do it."

Ronald Neilson, a U.S. Forest Service bio-climatologist in Corvallis, Ore., said the experiments should continue because they still have potential to answer key questions about how rainfall and fertility affect how much carbon a forest will store long-term — essential to understanding how forests may soften the blow of climate change.

But the Energy Department, following the advice of a specially convened panel of experts, believes that chopping down the trees and digging up the soil will allow the first real measurements of how much carbon the leaves, branches, trunks and roots have been storing, said J. Michael Kuperberg, a program manager with the agency.

Ending the experiments will also allow the funding to be devoted to new research that will look at the effects of higher temperatures, changes in rainfall, and variations in soil fertility, Kuperberg said.

"What we are trying to do here is balance the time to get optimal results out of the existing experiment with our desire for a new generation of experiments that we feel is more likely to realistically represent future climate scenarios," Kuperberg said.

Some scientists, though, believe ending the long-term research may be a mistake.

"If we stop these experiments now, it could cost many years to get back to this point, time we may not have," Kevin Lee Griffin, associate professor of environmental sciences at Columbia University, wrote in an e-mail.

The research program, Free Air CO2 Enrichment (FACE), consists of rings of tall white plastic pipes with holes along their length that emit once-liquefied carbon dioxide in carefully metered doses. The loblolly pines planted in 1994 at Duke in North Carolina are located behind gates several miles from campus.

[...]

The Department of Energy's Office of Biological & Environmental Research has informed those managing the experiments that their current research will be phased out by 2011. They are to get the definitive measurements on how tree growth, which represents stored carbon, was influenced, and should design new experiments to get rolling by 2012.

The panel found that the current experiments had a useful life of 10 to 12 years, and in a few more years the results would become invalid, in part because the trees were nearly taller than the pipes delivering the carbon dioxide.

Results so far indicate that elevated levels of carbon dioxide make forests grow more quickly, said Ram Oren, associate professor of ecology at Duke University's Nicholas School of the Environment and Earth Sciences and principal investigator on the experiments there.

But unless forests are on fertile ground — hard to come by because of development — growth will be in leaves, needles, and fine roots, which die off and decompose in a year or two, releasing the carbon dioxide back to the atmosphere, Oren said.

[...]

Rich Norby, who oversees the tree experiment at Oak Ridge, said he had thought it had run its course, but emerging trends indicate the new wood growth from increased carbon dioxide tapers off due to limitations of nitrogen — fertilizer — in the soil.


There are pros and cons with this one. Comments to come. Busy today.


Fast one: carbon uptake is limited by soil quality. This means that we may be able to suck up some CO2 via reforestation, but it will hit a wall after a point due to other nutrient issues.