universal-abyss: Cool - quantum phenomena that appear to explain bird flock behavior. Damn fascinating!
How bird flocks are like liquid helium
By Marcus Woo 27 July 2014 1:00 pm
A flock of starlings flies as one, a spectacular display in which each bird flits about as if in a well-choreographed dance. Everyone seems to know exactly when and where to turn. Now, for the first time, researchers have measured how that knowledge moves through the flock—a behavior that mirrors certain quantum phenomena of liquid helium.
“This is one of the first studies that gets to the details of how groups move in unison,” says David Sumpter of Uppsala University in Sweden, who was not part of the study.
The remarkable accord with which starling flocks fly has long puzzled researchers and bird watchers alike. In the 1930s, the ornithologist Edmund Selous even suggested that the birds cooperate via telepathy. Researchers have since turned to more scientifically sound ideas, using mathematical models.
In the 1990s, physicist Tamás Vicsek of Eötvös Loránd University in Budapest came up with one of the more successful models, which is based on the principle that each bird flies in the same direction as its neighbors. If a bird angles right, the ones next to it will turn to stay aligned. Although this model reproduces many features well—how a flock swiftly aligns itself from a random arrangement, for example—a team of researchers from Italy and Argentina has now discovered that it doesn’t accurately describe in detail how flocks turn.
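The Vicsek rule is simple enough to sketch in a few lines of Python. This is a minimal illustration only; the parameter values and function name are mine, not from the original paper. Each bird moves at constant speed and, at every step, steers toward the average heading of its neighbors within some radius, plus a little angular noise:

```python
import numpy as np

def vicsek_step(pos, theta, v=0.03, r=1.0, eta=0.1, L=10.0, rng=None):
    """One update of a Vicsek-style model: each agent adopts the mean
    heading of its neighbors within radius r, plus angular noise eta,
    in a periodic box of side L."""
    if rng is None:
        rng = np.random.default_rng(0)
    # pairwise displacement vectors with periodic boundary conditions
    d = pos[:, None, :] - pos[None, :, :]
    d -= L * np.round(d / L)
    neighbors = (d ** 2).sum(-1) < r ** 2   # boolean matrix, includes self
    # average neighbor direction via the mean of unit heading vectors
    sin_avg = (neighbors * np.sin(theta)[None, :]).sum(1)
    cos_avg = (neighbors * np.cos(theta)[None, :]).sum(1)
    noise = eta * rng.uniform(-np.pi, np.pi, len(theta))
    theta_new = np.arctan2(sin_avg, cos_avg) + noise
    # everyone moves one step along its new heading
    step = v * np.column_stack([np.cos(theta_new), np.sin(theta_new)])
    pos_new = (pos + step) % L
    return pos_new, theta_new
```

Iterating this rule from random headings quickly produces a globally aligned flock, which is exactly the feature the model reproduces well; the question the new study raises is what happens during a turn.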
In their new study, the team, led by physicists Andrea Cavagna and Asja Jelic of the Institute for Complex Systems in Rome, used high-speed cameras to film starlings—which are common in Rome and form spectacular flocks—flying near a local train station. Using tracking software on the recorded video, the team could pinpoint when and where individuals decide to turn, information that enabled them to follow how the decision sweeps through the flock. The tracking data showed that the message to turn started from a handful of birds and swept through the flock at a constant speed between 20 and 40 meters per second. That means that for a group of 400 birds, it takes just a little more than a half-second for the whole flock to turn.
“It’s a real tour de force of measurement,” says Sriram Ramaswamy of the Tata Institute of Fundamental Research’s Centre for Interdisciplinary Sciences in Hyderabad, India, who wasn’t part of the research.
The fact that the information telling each bird to turn moves at a constant speed contradicts the Vicsek model, Cavagna says. That model predicts that the information dissipates, he explains. If it were correct, not all the birds would get the message to turn in time, and the flock wouldn’t be able to fly as one.
The team proposes that instead of copying the direction in which a neighbor flies, a bird copies how sharply a neighbor turns. The researchers derived a mathematical description of how a turn moves through the flock. They assumed each bird had a property called spin, similar to the spins of elementary particles in physics. By matching one another’s spin, the birds conserved the total spin of the flock. As a result of that conservation, the equations showed that the information telling birds to change direction travels through the flock at a constant speed—exactly as the researchers observed. It’s this constant speed that enables everyone to turn in near-unison, the team reports online today in Nature Physics.
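The qualitative difference between the two pictures can be sketched with a pair of textbook equations (a schematic of the argument, not the paper's full derivation; the symbols here are generic). In a purely dissipative alignment model the flight-direction field φ relaxes diffusively, whereas adding a conserved spin s conjugate to the turning phase yields second-order, wave-like dynamics:

```latex
% Dissipative (Vicsek-like) alignment: disturbances smear out and decay
\frac{\partial \varphi}{\partial t} = D \,\nabla^2 \varphi

% Alignment with a conserved spin s conjugate to the turning phase:
\frac{\partial \varphi}{\partial t} \propto s, \qquad
\frac{\partial s}{\partial t} \propto \nabla^2 \varphi
\;\;\Longrightarrow\;\;
\frac{\partial^2 \varphi}{\partial t^2} = c^2 \,\nabla^2 \varphi
```

The second case is a wave equation, so a turn propagates across the flock at a constant speed c instead of damping out, which is what the tracking data showed.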
The new model also predicts that information travels faster if the flock is well aligned—something else the team observed, Cavagna says. Other models don’t predict or explain that relationship. “This could be the evolutionary drive to have an ordered flock,” he says, because the birds would be able to maneuver more rapidly and elude potential predators, among other things.
Interestingly, Cavagna adds, the new model is mathematically identical to the equations that describe superfluid helium. When helium is cooled close to absolute zero, it becomes a liquid with no viscosity at all, as dictated by the laws of quantum physics. Every atom in the superfluid is in the same quantum state, exhibiting a cohesion that’s mathematically similar to a starling flock.
The similarities are an example of how deep principles in physics and math apply to many physical systems, Cavagna says. Indeed, the theory could apply to other types of group behavior, such as fish schools or assemblages of moving cells, Sumpter says.
Other models, such as the Vicsek model or others that treat the flock as a sort of fluid, probably still describe flock behavior over longer time and length scales, Ramaswamy says. But it’s notable that the new model, which is still based on relatively simple principles, can accurately reproduce behavior at shorter scales. “I think that’s cool,” he says. “That’s an achievement, really.”
Sumpter agrees. “It’s kind of reassuring we don’t need to think about the telepathic explanation,” he says.
Mysterious molecules in space
PUBLIC RELEASE DATE: 29-Jul-2014
Contact: Jason Socrates Bardi American Institute of Physics
Researchers at Harvard-Smithsonian Center for Astrophysics finger silicon-capped hydrocarbons as possible source of mysterious ‘diffuse interstellar bands’
Pic: Caption: This graph shows absorption wavelength as a function of the number of carbon atoms in the silicon-terminated carbon chains SiC_(2n+1)H, for the extremely strong pi-pi electronic transitions. When the chain contains 13 or more carbon atoms - not significantly longer than carbon chains already known to exist in space - these strong transitions overlap with the spectral region occupied by the elusive diffuse interstellar bands. Credit: D. Kokkin, ASU. Usage Restrictions: This image may be used only with appropriate caption and credit.
WASHINGTON D.C., July 29, 2014 – Over the vast, empty reaches of interstellar space, countless small molecules tumble quietly through the cold vacuum. Forged in the fusion furnaces of ancient stars and ejected into space when those stars exploded, these lonely molecules account for a significant amount of all the carbon, hydrogen, silicon and other atoms in the universe. In fact, some 20 percent of all the carbon in the universe is thought to exist as some form of interstellar molecule.
Many astronomers hypothesize that these interstellar molecules are also responsible for a phenomenon observed from Earth known as the “diffuse interstellar bands”: spectrographic proof that something out there in the universe is absorbing certain distinct colors of starlight before it reaches the Earth. But since we don’t know the exact chemical composition and atomic arrangements of these mysterious molecules, it remains unproven whether they are, in fact, responsible for the diffuse interstellar bands.
Now in a paper appearing this week in The Journal of Chemical Physics, from AIP Publishing, a group of scientists led by researchers at the Harvard-Smithsonian Center for Astrophysics in Cambridge, Mass., has offered a tantalizing new possibility: these mysterious molecules may be silicon-capped hydrocarbons like SiC3H, SiC4H and SiC5H, and the group presents data and theoretical arguments to back that hypothesis.
At the same time, the group cautions that history has shown that while many possibilities have been proposed as the source of diffuse interstellar bands, none has been proven definitively.
“There have been a number of explanations over the years, and they cover the gamut,” said Michael McCarthy, a senior physicist at the Harvard-Smithsonian Center for Astrophysics who led the study.
Molecules in Space and How We Know They’re There
Astronomers have long known that interstellar molecules containing carbon atoms exist and that by their nature they will absorb light shining on them from stars and other luminous bodies. Because of this, a number of scientists have previously proposed that some type of interstellar molecules are the source of diffuse interstellar bands — the hundreds of dark absorption lines seen in color spectrograms taken from Earth.
In showing nothing, these dark bands reveal everything. The missing colors correspond to photons of given wavelengths that were absorbed as they travelled through the vast reaches of space before reaching us. More than that, if these photons were absorbed by molecules in space, the missing wavelengths reveal the exact energies it took to excite the electronic structures of those absorbing molecules in a defined way.
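The wavelength-to-energy bookkeeping behind that statement is just Planck's relation E = hc/λ. A quick numerical check (the 443 nm value is a commonly cited strong diffuse band, used here purely for illustration; it is not taken from this press release):

```python
# Photon energy corresponding to an absorption wavelength, E = h*c / lambda.
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s
eV = 1.602176634e-19 # joules per electron volt

def photon_energy_eV(wavelength_nm):
    """Energy (in eV) of a photon with the given wavelength in nanometers."""
    return h * c / (wavelength_nm * 1e-9) / eV

print(photon_energy_eV(443.0))  # ~2.8 eV
```

Matching a band therefore means finding a molecule with an electronic transition at exactly that energy, which is what the laboratory spectroscopy attempts.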
Armed with that information, scientists here on Earth should be able to use spectroscopy to identify those interstellar molecules — by demonstrating which molecules in the laboratory have the same absorptive “fingerprints.” But despite decades of effort, the identity of the molecules that account for the diffuse interstellar bands remains a mystery. Nobody has been able to reproduce the exact same absorption spectra in laboratories here on Earth.
“Not a single one has been definitively assigned to a specific molecule,” said Neil Reilly, a former postdoctoral fellow at Harvard-Smithsonian Center for Astrophysics and a co-author of the new paper.
Now Reilly, McCarthy and their colleagues are pointing to an unusual set of molecules — silicon-terminated carbon chain radicals — as a possible source of these mysterious bands.
As they report in their new paper, the team first created silicon-containing carbon chains SiC3H, SiC4H and SiC5H in the laboratory using a jet-cooled silane-acetylene discharge. They then analyzed their spectra and carried out theoretical calculations to predict that longer chains in this family might account for some portion of the diffuse interstellar bands.
However, McCarthy cautioned that the work has not yet revealed the smoking gun source of the diffuse interstellar bands. In order to prove that these larger silicon capped hydrocarbon molecules are such a source, more work needs to be done in the laboratory to define the exact types of transitions these molecules undergo, and these would have to be directly related to astronomical observations. But the study provides a tantalizing possibility for finding the elusive source of some of the mystery absorption bands — and it reveals more of the rich molecular diversity of space.
“The interstellar medium is a fascinating environment,” McCarthy said. “Many of the things that are quite abundant there are really unknown on Earth.”
The article, “Optical Spectra of the Silicon-Terminated Carbon Chain Radicals SiCnH (n=3,4,5),” is authored by D. L. Kokkin, N. J. Reilly, R. C. Fortenberry, T. D. Crawford and M. C. McCarthy. It will be published in The Journal of Chemical Physics on July 29, 2014. After that date, it can be accessed at: http://scitation.aip.org/content/aip/journal/jcp/141/4/10.1063/1.4883521
Authors of the paper are affiliated with Harvard University, Arizona State University, Virginia Tech, the University of Louisville and Georgia Southern University.
ABOUT THE JOURNAL The Journal of Chemical Physics publishes concise and definitive reports of significant research in the methods and applications of chemical physics. See: http://jcp.aip.org
universal-abyss: Diffuse interstellar bands explained by interstellar molecules? More fuel for the mystery. Damn, I just love a good mystery, don’t you?
The quantum Cheshire cat
PUBLIC RELEASE DATE: 29-Jul-2014
Contact: Florian Aigner Vienna University of Technology
Pic: Caption: The basic idea of the Quantum Cheshire Cat: In an interferometer, an object is separated from one of its properties — like a cat, moving on a different path than its own grin. Credit: TU Vienna / Leon Filter. Usage Restrictions: None.
Can neutrons be located at a different place than their own spin? A quantum experiment, carried out by a team of researchers from the Vienna University of Technology, demonstrates a new kind of quantum paradox
The Cheshire Cat featured in Lewis Carroll’s novel “Alice in Wonderland” is a remarkable creature: it disappears, leaving its grin behind. Can an object be separated from its properties? It is possible in the quantum world. In an experiment, neutrons travel along a different path than one of their properties – their magnetic moment. This “Quantum Cheshire Cat” could be used to make high-precision measurements less sensitive to external perturbations.
At Different Places at Once
According to the laws of quantum physics, particles can be in different physical states at the same time. If, for example, a beam of neutrons is divided into two beams using a silicon crystal, it can be shown that the individual neutrons do not have to decide which of the two possible paths they choose. Instead, they can travel along both paths at the same time in a quantum superposition.
“This experimental technique is called neutron interferometry”, says Professor Yuji Hasegawa from the Vienna University of Technology. “It was invented here at our institute in the 1970s, and it has turned out to be the perfect tool to investigate fundamental quantum mechanics.”
To see if the same technique could separate the properties of a particle from the particle itself, Yuji Hasegawa brought together a team including Tobias Denkmayr, Hermann Geppert and Stephan Sponar, together with Alexandre Matzkin from CNRS in France, Professor Jeff Tollaksen from Chapman University in California and Hartmut Lemmel from the Institut Laue-Langevin to develop a brand-new quantum experiment.
The experiment was done at the neutron source at the Institut Laue-Langevin (ILL) in Grenoble, where a unique kind of measuring station is operated by the Viennese team, supported by Hartmut Lemmel from ILL.
Where is the Cat …?
Neutrons are not electrically charged, but they carry a magnetic moment. They have a magnetic direction, the neutron spin, which can be influenced by external magnetic fields.
First, a neutron beam is split into two parts in a neutron interferometer. Then the spins of the two beams are shifted into different directions: the upper neutron beam has a spin parallel to the neutrons’ trajectory, while the spin of the lower beam points in the opposite direction. After the two beams have been recombined, only those neutrons whose spin is parallel to their direction of motion are selected; all the others are ignored. “This is called postselection”, says Hermann Geppert. “The beam contains neutrons of both spin directions, but we only analyse part of the neutrons.”
These neutrons, which are found to have a spin parallel to their direction of motion, must clearly have travelled along the upper path - only there do the neutrons have this spin state. This can be shown in the experiment. If the lower beam is sent through a filter which absorbs some of the neutrons, the number of neutrons with spin parallel to their trajectory stays the same. If the upper beam is sent through a filter, then the number of these neutrons is reduced.
… and Where is the Grin?
Things get tricky when the system is used to measure where the neutron spin is located: the spin can be slightly changed using a magnetic field. When the two beams are recombined appropriately, they can amplify or cancel each other. This is exactly what is seen in the measurement when the magnetic field is applied to the lower beam – the very path the neutrons considered in the experiment are never supposed to take. A magnetic field applied to the upper beam, on the other hand, has no effect.
“By preparing the neutrons in a special initial state and then postselecting another state, we can achieve a situation in which both the possible paths in the interferometer are important for the experiment, but in very different ways”, says Tobias Denkmayr. “Along one of the paths, the particles themselves couple to our measurement device, but only the other path is sensitive to magnetic spin coupling. The system behaves as if the particles were spatially separated from their properties.”
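That "spatially separated from their properties" picture can be made concrete with so-called weak values, the formalism behind the Cheshire Cat proposal. Below is a toy two-path, spin-1/2 calculation; the pre- and postselected states are illustrative choices made for simplicity, not the actual neutron states used in the experiment. The weak value of the path projector places the particle entirely on one path, while the weak value of the transverse spin is nonzero only on the other:

```python
import numpy as np

# Path and spin bases
L = np.array([1.0, 0.0]); R = np.array([0.0, 1.0])    # left / right path
up = np.array([1.0, 0.0]); dn = np.array([0.0, 1.0])  # spin-z up / down
sx = np.array([[0.0, 1.0], [1.0, 0.0]])               # transverse spin
I2 = np.eye(2)

# Pre-selected (entangled) state and post-selected state
psi = (np.kron(L, up) + np.kron(R, dn)) / np.sqrt(2)
phi = np.kron((L + R) / np.sqrt(2), up)

def weak_value(A):
    """Weak value <phi|A|psi> / <phi|psi> for an operator A."""
    return (phi.conj() @ A @ psi) / (phi.conj() @ psi)

P_L = np.kron(np.outer(L, L), I2)    # "is the particle on the left path?"
P_R = np.kron(np.outer(R, R), I2)
Sx_L = np.kron(np.outer(L, L), sx)   # "is the transverse spin on the left?"
Sx_R = np.kron(np.outer(R, R), sx)

print(weak_value(P_L), weak_value(P_R))    # ~1 and ~0: particle on the left
print(weak_value(Sx_L), weak_value(Sx_R))  # ~0 and ~1: spin on the right
```

The cat sits on one path while its grin travels the other, exactly the counterintuitive structure the neutron experiment probes.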
High Hopes for High-Precision Measurements
This counterintuitive effect is very interesting for high-precision measurements, which are very often based on the principle of quantum interference. “When the quantum system has a property you want to measure and another property which makes the system prone to perturbations, the two can be separated using a Quantum Cheshire Cat, and possibly the perturbation can be minimized”, says Stephan Sponar.
The idea of the Quantum Cheshire Cat was first proposed by Prof. Yakir Aharonov and first published by Aharonov’s collaborator, Prof. Jeff Tollaksen (both now at Chapman University), in 2001. The measurements now presented are the first experimental demonstration of this phenomenon. The results have been published in the journal “Nature Communications”.
universal-abyss: Another quantum thought experiment. So damned fascinating to ponder…
Worldwide water shortage by 2040
Date: July 29, 2014
Source: Aarhus University
Summary: Water is used around the world for the production of electricity, but new research results show that there will not be enough water in the world to meet demand by 2040 if the energy and power situation does not improve before then.
Water is used around the world for the production of electricity, but new research results show that there will not be enough water in the world to meet demand by 2040 if the energy and power situation does not improve before then.
Two new reports that focus on the global electricity-water nexus have just been published. Three years of research show that by the year 2040 there will not be enough water in the world to quench the thirst of the world population and keep the current energy and power solutions going if we continue doing what we are doing today. It is a clash of competing necessities, between drinking water and energy demand. Behind the research is a group of researchers from Aarhus University in Denmark, Vermont Law School and CNA Corporation in the US.
In most countries, electricity production is the biggest source of water consumption because power plants need cooling cycles in order to function. The only major energy systems that do not require cooling cycles are wind and solar, and therefore one of the researchers’ primary recommendations is to replace old power systems with more sustainable wind and solar systems.
The research has also yielded the surprising finding that most power systems do not even register how much water is being used to keep the systems going.
By 2020 the water issue affects 30-40% of the world
“It’s a huge problem that the electricity sector does not even realise how much water it actually consumes. And together with the fact that we do not have unlimited water resources, it could lead to a serious crisis if nobody acts on it soon,” says Professor Benjamin Sovacool from Aarhus University.
Combining the new research results with projections about water shortage and world population growth shows that by 2020 many areas of the world will no longer have access to clean drinking water. In fact, the results predict that by 2020 about 30-40% of the world will face water scarcity, and according to the researchers, climate change can make this even worse.
“This means that we’ll have to decide where we spend our water in the future. Do we want to spend it on keeping the power plants going or as drinking water? We don’t have enough water to do both,” says Professor Benjamin Sovacool.
How to solve the problem?
In the reports, the researchers emphasise six general recommendations for decision-makers to follow in order to stop this development and handle the crisis around the world:
1) Improve energy efficiency
2) Better research on alternative cooling cycles
3) Register how much water power plants use
4) Massive investments in wind energy
5) Massive investments in solar energy
6) Abandon fossil fuel facilities in all water-stressed places (which means half the planet)
Close up on France, the US, China and India
The team of researchers conducted their research focusing on four different case studies in France, the United States, China and India respectively. Rather than reviewing the situation on a national level, the team narrowed in and focused on specific utilities and energy suppliers. The first step was identifying the current energy needs, and then the researchers made projections as far as 2040, and most of the results were surprising. All four case studies project that it will be impossible to continue to produce electricity in this way and meet the water demand by 2040.
“If we keep doing business as usual, we are facing an insurmountable water shortage — even if water was free, because it’s not a matter of the price. There will be no water by 2040 if we keep doing what we’re doing today. There’s no time to waste. We need to act now,” concludes Professor Benjamin Sovacool.
Story Source: The above story is based on materials provided by Aarhus University. The original article was written by Winnie Axelsen. Note: Materials may be edited for content and length.
Cite This Page: Aarhus University. “Worldwide water shortage by 2040.” ScienceDaily, 29 July 2014.
from-dust-of-stars: Some very sobering research points to serious water shortage potential in the very near future. And, for some good news, uh, let me get back to you on that.
Global warming amplifier: Rising water vapor in upper troposphere to intensify climate change
Date: July 28, 2014
Source: University of Miami Rosenstiel School of Marine & Atmospheric Science
Summary: A new study from scientists at the University of Miami Rosenstiel School of Marine and Atmospheric Science and colleagues confirms rising levels of water vapor in the upper troposphere — a key amplifier of global warming — will intensify climate change impacts over the next decades. The new study is the first to show that increased water vapor concentrations in the atmosphere are a direct result of human activities.
Pic: Illustration of annual mean T2-T12 field that provides a direct measure of the upper-tropospheric water vapor. Purple = dry and Red = moist. Credit: Eui-Seok Chung, Ph.D. Assistant Scientist - UM Rosenstiel School of Marine and Atmospheric Science
A new study from scientists at the University of Miami Rosenstiel School of Marine and Atmospheric Science and colleagues confirms rising levels of water vapor in the upper troposphere — a key amplifier of global warming — will intensify climate change impacts over the next decades. The new study is the first to show that increased water vapor concentrations in the atmosphere are a direct result of human activities.
“The study is the first to confirm that human activities have increased water vapor in the upper troposphere,” said Brian Soden, professor of atmospheric sciences at the UM Rosenstiel School and co-author of the study.
To investigate the potential causes of a 30-year moistening trend in the upper troposphere, a region 3-7 miles above Earth’s surface, Soden, UM Rosenstiel School researcher Eui-Seok Chung and colleagues analyzed water-vapor measurements of the upper troposphere collected by NOAA satellites and compared them to climate model predictions of water circulation between the ocean and atmosphere, to determine whether the observed changes in atmospheric water vapor could be explained by natural or human-made causes. Using the set of climate model experiments, the researchers showed that rising water vapor in the upper troposphere cannot be explained by natural forces, such as volcanoes and changes in solar activity, but can be explained by increased greenhouse gases, such as CO2.
Greenhouse gases raise temperatures by trapping Earth’s radiant heat inside the atmosphere. This warming also increases the accumulation of atmospheric water vapor, the most abundant greenhouse gas. The atmospheric moistening traps additional radiant heat and further increases temperatures.
Climate models predict that as the climate warms from the burning of fossil fuels, the concentrations of water vapor will also increase in response to that warming. This moistening of the atmosphere, in turn, absorbs more heat and further raises Earth’s temperature.
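That amplifier loop has a standard textbook form: if a fraction f of any initial warming comes back as extra warming via added water vapor, the geometric series ΔT0(1 + f + f² + …) sums to ΔT0/(1 − f). A minimal sketch of that arithmetic, with purely illustrative numbers that are not taken from this study:

```python
def amplified_warming(delta_t0, f):
    """Linear-feedback amplification: a fraction f (0 <= f < 1) of the
    warming returns as additional warming, so the total is
    dT0 + f*dT0 + f^2*dT0 + ... = dT0 / (1 - f)."""
    assert 0 <= f < 1, "feedback fraction must be below 1 for a finite sum"
    return delta_t0 / (1 - f)

# Illustrative only: 1.2 degrees of direct greenhouse warming with a
# feedback fraction of 0.5 doubles to 2.4 degrees.
print(amplified_warming(1.2, 0.5))  # 2.4
```

The study's contribution is not this arithmetic but the evidence that the moistening driving f in the upper troposphere is attributable to human activity.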
The paper, titled “Upper Tropospheric Moistening in response to Anthropogenic Warming,” was published in the July 28th, 2014 online Early Edition of the journal Proceedings of the National Academy of Sciences (PNAS). The paper’s authors include Chung, Soden, B.J. Sohn of Seoul National University, and Lei Shi of NOAA’s National Climatic Data Center in Asheville, North Carolina.
Story Source: The above story is based on materials provided by University of Miami Rosenstiel School of Marine & Atmospheric Science. Note: Materials may be edited for content and length.
Journal Reference: Eui-Seok Chung, Brian Soden, B. J. Sohn, and Lei Shi. Upper-tropospheric moistening in response to anthropogenic warming. Proceedings of the National Academy of Sciences, 2014; DOI: 10.1073/pnas.1409659111
Cite This Page: University of Miami Rosenstiel School of Marine & Atmospheric Science. “Global warming amplifier: Rising water vapor in upper troposphere to intensify climate change.” ScienceDaily, 28 July 2014.
from-dust-of-stars: Water vapor confirmed as a key amplifier of global warming. So, more fun weather in store, thanks in large part to human hubris and sheer denial… Damn.
Virginia Is for Gay Lovers
EMMA GREEN JUL 29 2014, 7:58 AM ET
On Monday, a federal court ruled the state’s same-sex-marriage ban unconstitutional—the latest to be overturned. What makes the decision in the Old Dominion different?
“[T]he freedom to marry has long been recognized as one of the vital personal rights essential to the orderly pursuit of happiness by free men.”
This is a line from the famous 1967 Supreme Court case Loving v. Virginia, which established interracial couples’ right to marry. On Monday, the Fourth Circuit Court of Appeals recalled Loving in its ruling in Bostic v. Schaefer, which affirmed a lower court’s ruling striking down Virginia’s constitutional ban on same-sex marriage.
In his opinion for the 2-1 majority, Judge Henry F. Floyd delivered a strongly worded argument for the rights of same-sex couples. “We recognize that same-sex marriage makes some people deeply uncomfortable,” he wrote. “However, inertia and apprehension are not legitimate bases for denying same-sex couples due process and equal protection of the laws …. Denying same-sex couples this choice prohibits them from participating fully in our society, which is precisely the type of segregation that the Fourteenth Amendment cannot countenance.”
Judge Paul Niemeyer, who dissented in the Fourth Circuit’s decision, argued that the majority’s ruling failed to account for the history of the “right to marry” in America. “[S]ame-sex marriage is a new notion that has not been recognized for most of our country’s history,” he wrote. “In holding that same-sex marriage is encompassed by the traditional right to marry, the majority avoids the necessary constitutional analysis, concluding simply and broadly that the fundamental ‘right to marry’—by everyone and to anyone—may not be infringed.”
The question of “rights” is exactly what makes this decision significant, said Claire Guthrie Gastañaga, the executive director of the American Civil Liberties Union of Virginia. Unlike some other cases on same-sex-union laws, Bostic examines whether couples have a fundamental right to marriage. The judges applied strict scrutiny, the highest standard of legal review, under which the government has to show a compelling interest for limiting the plaintiffs’ ability to marry. “This court says very clearly: This is a fundamental right, and the government just didn’t meet their burden of explaining why there should be a [ban on] same-sex marriage,” Gastañaga said.
In his dissent, Niemeyer argued that applying strict scrutiny led to a decision that’s far too sweeping. “If the fundamental right to marriage is based on ‘the constitutional liberty to select the partner of one’s choice,’ as [the plaintiffs] contend, then that liberty would also extend to individuals seeking state recognition of other types of relationships that States currently restrict, such as polygamous or incestuous relationships,” he wrote. Perhaps the government’s interests would withstand challenges to these laws, he said, but “today’s decision would truly be a sweeping one if it could be understood to mean that individuals have a fundamental right to enter into a marriage with any person, or any people, of their choosing.”
This summer, courts have struck down same-sex marriage bans in other states, including Arkansas, Colorado, Florida, Idaho, Kentucky, Oklahoma, Pennsylvania, Utah, and Wisconsin. But same-sex-marriage advocates say the ruling in Virginia is distinctive in several ways.
For one thing, it’s the first-ever ruling on a class-action suit against a state’s same-sex-marriage ban. The original plaintiffs were a lesbian couple and a gay couple—the case is named for one of the partners, Timothy Bostic. When the case reached the Fourth Circuit, these couples combined their case with “14,000 couples across Virginia who wanted to get married or who are married and want their marriage to get recognized,” Gastañaga said. If the ruling stands, it will automatically invalidate gay-marriage bans in the other states under the jurisdiction of the Fourth Circuit: North Carolina, South Carolina, and West Virginia.
“Virginia has a very clear pattern of repeated efforts by the legislature to ban same-sex couples from getting married.”
That might not happen for a while, though—if it happens at all. The ruling institutes a 21-day delay before implementation, during which time the defendants in the case may request a stay of the ruling. If a stay is granted, the ruling may not go into effect until the Supreme Court decides whether it wants to take up the issue of same-sex marriage in a future slate of cases.
John Eastman, chairman of the board of directors of the National Organization for Marriage, which opposes same-sex marriage, said this decision won’t change much in Virginia or elsewhere. “It’s not a major setback—it was not unexpected, given the panel and the raw politics that are playing out here. Everybody in the country fully expects that this thing is not going to be fully resolved until it makes its way to the Supreme Court.”
Gay marriage has historically been divisive in Virginia—the state has had an uncommonly contentious battle over the issue. “Virginia has a very clear pattern of repeated efforts by the legislature to ban same-sex couples from getting married,” said James Esseks, the director of the ACLU’s Lesbian, Gay, Bisexual, and Transgender Project. “There are few—if any—states in America that have a history of banging on that drum quite as many times as Virginia.”
This has sometimes shown up in subtle ways. This spring, married same-sex couples in the United States had to file federal taxes together for the first time following the Supreme Court’s ruling in United States v. Windsor. While many states accepted gay couples’ unions for tax purposes in order to simplify their filing process, Virginia didn’t, requiring same-sex couples to follow a six-page document of special guidelines and file multiple returns.
Same-sex marriage advocates are understandably eager to declare this a landmark civil rights case. “I think that it’s really quite appropriate that this decision today comes from Virginia,” Esseks said. “The foundation for the fundamental right to marry was created in Loving v. Virginia in 1967—Virginia is where the fundamental right to marry was born.”
And a lot has changed in 47 years: In Loving, the state ardently defended its ban on interracial marriage, while in Bostic, Virginia’s attorney general refused to oppose the plaintiffs in the case.
“It’s an utter abdication of duty, not defending a statute duly enacted by the people of the state of Virginia, and to do it on the grounds that there’s no argument to be made,” Eastman said. “That’s patently false.”
Esseks sees the attorney general’s decision in a more symbolic light. “It’s a big contrast—it shows that Virginia doesn’t want to get it wrong again,” he said.
from-dust-of-stars: Freaking awesome, yet another state concedes the legitimate legal right for same-sex marriage!
This is a beautifully logical statement, excerpted from above, that put it all nicely in perspective:
Judge Henry F. Floyd delivered a strongly worded argument for the rights of same-sex couples. “We recognize that same-sex marriage makes some people deeply uncomfortable,” he wrote. “However, inertia and apprehension are not legitimate bases for denying same-sex couples due process and equal protection of the laws …. Denying same-sex couples this choice prohibits them from participating fully in our society, which is precisely the type of segregation that the Fourteenth Amendment cannot countenance.”
Measuring the smallest magnets: Physicists measure magnetic interactions between single electrons
Date: July 28, 2014
Source: Weizmann Institute of Science
Summary: Imagine trying to measure a tennis ball that bounces wildly, every time to a distance a million times its own size. The bouncing obviously creates enormous “background noise” that interferes with the measurement. But if you attach the ball directly to a measuring device, so they bounce together, you can eliminate the noise problem. Physicists have used a similar trick to measure the interaction between the smallest possible magnets — two single electrons — after neutralizing magnetic noise that was a million times stronger than the signal they needed to detect.
Pic: An illustration showing the magnetic field lines of two electrons, arranged so that their spins point in opposite directions. Credit: Image courtesy of Weizmann Institute of Science
As reported recently in Nature, physicists at the Weizmann Institute of Science used a similar trick to measure the interaction between the smallest possible magnets — two single electrons — after neutralizing magnetic noise that was a million times stronger than the signal they needed to detect.
Dr. Roee Ozeri of the Institute’s Physics of Complex Systems Department says: “The electron has spin, a form of orientation involving two opposing magnetic poles. In fact, it’s a tiny bar magnet.” The question is whether pairs of electrons act like regular bar magnets in which the opposite poles attract one another.
Dr. Shlomi Kotler performed the study while a graduate student under Dr. Ozeri’s guidance, with Drs. Nitzan Akerman, Nir Navon and Yinnon Glickman. Detecting the magnetic interaction of two electrons poses an enormous challenge: When the electrons are at a close range — as they normally are in an atomic orbit — forces other than the magnetic one prevail. On the other hand, if the electrons are pulled apart, the magnetic force becomes dominant, but so weak in absolute terms that it’s easily drowned out by ambient magnetic noise emanating from power lines, lab equipment and the earth’s magnetic field.
The scientists overcame the problem by borrowing a trick from quantum computing that protects quantum information from outside interference. This technique binds two electrons together so that their spins point in opposite directions. Thus, like the bouncing tennis ball attached to the measuring device, the combination of equal but opposite spins makes the electron pair impervious to magnetic noise.
The Weizmann scientists built an electric trap in which two electrons are bound to two strontium ions that are cooled close to absolute zero and separated by 2 micrometers (millionths of a meter). At this distance, which is astronomical by the standards of the quantum world, the magnetic interaction is very weak. But because the electron pairs were not affected by external magnetic noise, the interactions between them could be measured with great precision. The measurement lasted for 15 seconds — tens of thousands of times longer than the milliseconds during which scientists have until now been able to preserve quantum data.
The measurements showed that the electrons interacted magnetically just as two large magnets do: Their north poles repelled one another, rotating on their axes until their unlike poles drew near. This is in line with the predictions of the Standard Model, the currently accepted theory of matter. Also as predicted, the magnetic interaction weakened in inverse proportion to the cube of the distance between the electrons.
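That inverse-cube falloff is the classical magnetic dipole–dipole law. As a quick numerical sketch (my own back-of-the-envelope check, not a calculation from the paper — it assumes the head-to-tail dipole formula and an electron moment of one Bohr magneton), both the scaling and the extreme weakness of the signal at 2 micrometres are easy to see:

```python
import math

# Rough check (not from the paper): two electron-sized bar magnets interact
# with an energy falling off as 1/r^3, so doubling the separation should
# weaken the interaction eightfold.
MU0 = 4e-7 * math.pi   # vacuum permeability, T*m/A
MU_B = 9.274e-24       # Bohr magneton, J/T (electron moment is roughly 1 Bohr magneton)

def dipole_energy(r):
    """Magnitude of the head-to-tail dipole-dipole energy at separation r (metres)."""
    return MU0 * MU_B**2 / (2 * math.pi * r**3)

e_near = dipole_energy(2e-6)  # at the experiment's 2-micrometre separation
e_far = dipole_energy(4e-6)   # at double that separation

print(e_near)          # on the order of 1e-36 joules: an extraordinarily weak signal
print(e_near / e_far)  # close to 8, the cube of the distance ratio
```

The tiny absolute energy is why the million-fold-stronger ambient noise had to be cancelled before the effect could be seen at all.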
In addition to revealing a fundamental principle of particle physics, the measurement approach may prove useful in such areas as the development of atomic clocks or the study of quantum systems in a noisy environment.
Story Source: The above story is based on materials provided by Weizmann Institute of Science. Note: Materials may be edited for content and length.
Journal Reference: Shlomi Kotler, Nitzan Akerman, Nir Navon, Yinnon Glickman, Roee Ozeri. Measurement of the magnetic interaction between two bound electrons of two separate ions. Nature, 2014; 510 (7505): 376 DOI: 10.1038/nature13403
Citation: Weizmann Institute of Science. “Measuring the smallest magnets: Physicists measure magnetic interactions between single electrons.” ScienceDaily, 28 July 2014.
universal-abyss: Ah, that moment of “revealing a fundamental principle of particle physics” - how thrilling!
Glow in space is evidence of a hot bubble in our galaxy
Date: July 28, 2014
Source: University of Miami
Summary: A recent study shows that the emission is dominated by the local hot bubble of gas — 1 million degrees — with, at most, 40 percent of emission originating within the solar system. The findings should put to rest the disagreement about the origin of the X-ray emission and confirm the existence of the local hot bubble.
When we look up to the heavens on a clear night, we see an immense dark sky with uncountable stars. With a small telescope we can also see galaxies, nebulae, and the disks of planets. If we looked at the sky with an X-ray detector, we would see many of these same familiar objects; in addition, we would see the whole sky glowing brightly with X-rays. This glow is called the “diffuse X-ray background.”
While, at higher energies, the diffuse emission is due to point sources too far away and faint to be seen individually, the origins of the soft X-ray glow have been controversial, even 50 years after it was first discovered. The long-standing debate centers around whether the soft X-ray emission comes from outside our solar system, from a hot bubble of gas called the local hot bubble, or whether the emission comes from within the solar system, due to the solar wind colliding with diffuse gas.
New findings settle this controversy. A recent study shows that the emission is dominated by the local hot bubble of gas (1 million degrees), with, at most, 40 percent of emission originating within the solar system. The findings, published in the journal Nature, should put to rest the disagreement about the origin of the X-ray emission and confirm the existence of the local hot bubble.
“We now know that the emission comes from both sources, but is dominated by the local hot bubble,” said Massimiliano Galeazzi, professor and associate chair in the Department of Physics in the College of Arts and Sciences at the University of Miami (UM) and principal investigator of the study. “This is a significant discovery. Specifically, the existence or nonexistence of the local bubble affects our understanding of the galaxy in the proximity of the Sun and can be used as a foundation for future models of the Galaxy’s structure.”
Galeazzi, who led the investigation, and his collaborators from NASA, the University of Wisconsin-Madison, the University of Michigan, the University of Kansas, the Johns Hopkins University and CNES in France, launched a sounding rocket mission to analyze the diffuse X-ray emission, with the goal of identifying how much of that emission comes from within our solar system and how much from the local hot bubble.
“The DXL team is an extraordinary example of cross-disciplinary science, bringing together astrophysicists, planetary scientists, and heliophysicists,” said F. Scott Porter, astrophysicist at NASA’s Goddard Space Flight Center. “It’s unusual but very rewarding when scientists with such diverse interests come together to produce such groundbreaking results.”
The study measured the diffuse X-ray emission at low energy, what is referred to as the 1/4 keV band, corresponding to radiation with wavelength of the order of 5 nm.
“At that low energy, the light gets absorbed by the neutral gas in our galaxy, so the fact that we observe it means that the source must be ‘local,’ possibly within a few hundred light-years from Earth. However, until now it was unclear whether it comes from within the solar system (within a few astronomical units from Earth) or from a very hot bubble of gas in the solar neighborhood (hundreds of light-years from Earth). This is like traveling at night and seeing a light, not knowing if the light comes from 10 yards or 1,000 miles away,” Galeazzi said.
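The quoted numbers check out: photon energy and wavelength are related by λ = hc/E, and in convenient units hc ≈ 1239.84 eV·nm (a standard physical constant, not a figure from the article):

```python
# Convert the "1/4 keV band" to a wavelength via lambda = h*c / E.
HC_EV_NM = 1239.84  # Planck constant times speed of light, in eV*nm

def wavelength_nm(energy_ev):
    """Photon wavelength in nanometres for a given photon energy in electronvolts."""
    return HC_EV_NM / energy_ev

print(wavelength_nm(250))  # roughly 4.96 nm, i.e. the "order of 5 nm" quoted above
```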
Interstellar bubbles are probably created by stellar winds and supernova explosions, which cast material outward, forming large cavities in the interstellar medium — the material that fills the space between the stars in a galaxy. Hot X-ray emitting gas can fill the bubble, if a second supernova occurs within the empty cavity.
X-ray emission also occurs within our solar system when the solar wind collides with interplanetary neutral gas. The solar wind is a stream of charged particles released, with great energy, from the atmosphere of the sun; it travels vast distances, carving out a region called the heliosphere. As these particles travel through space at supersonic speeds, they may collide with neutral hydrogen and helium that enters the solar system due to the motion of the sun in the galaxy, capturing an electron and emitting X-rays. This is called the solar wind charge exchange process.
The team refurbished and modernized an X-ray detector that was mounted on a sounding rocket. The X-ray detector was originally flown by the University of Wisconsin-Madison on multiple missions during the 1970s to map the soft X-ray sky. The current team, led by Galeazzi, rebuilt, tested, calibrated, and adapted the detectors to a modern NASA suborbital sounding rocket. Components from a 1993 Space Shuttle mission were also used. The mission, known as “The Diffuse X-ray emission from the Local Galaxy” (DXL) sounding rocket, aimed at separating and quantifying the X-ray emission from the two suspected sources: the local hot bubble and the solar wind charge exchange. This was the first mission designed for this kind of study.
“X-ray telescopes on satellites can observe for long periods of time and have reasonably large collecting areas, but very tiny fields of view, so they are very good for studying a small area in great detail,” said Dan McCammon, professor of Physics at the University of Wisconsin-Madison and one of the scientists who built the original instrument. “However, the observations for this experiment needed to look at a large part of the sky in a short time, to make sure the solar wind did not change during the measurements. The sounding rocket could do it 4000 times faster.”
The rocket was launched with the support of NASA’s Wallops Flight Facility, from White Sands Missile Range in New Mexico, on December 12, 2012. It reached an altitude of 258 km (160 miles), and stayed above the Earth’s atmosphere for five minutes, enough time to carry out its mission successfully. The information collected was transmitted directly to researchers on the ground at the launch facility.
“The sounding rocket program allows us to conduct high-risk, high-payoff science quickly and inexpensively,” Porter said. “It is really one of NASA’s crown jewels.”
Galeazzi and collaborators are already planning the next launch. The next mission will be similar in design and goals, but will have multiple instruments to characterize the emission in more detail. The launch is currently planned for December 2015.
The study was made possible with support from NASA. In addition, the collaborators in France received support from the National Program “Soleil Héliosphère Magnétosphère” of the French space agency CNES, and the National Program “Physique Chimie du Milieu Interstellaire” of the Institut national des sciences de l’univers (INSU).
Story Source: The above story is based on materials provided by University of Miami. Note: Materials may be edited for content and length.
Journal Reference: M. Galeazzi, M. Chiao, M. R. Collier, T. Cravens, D. Koutroumpa, K. D. Kuntz, R. Lallement, S. T. Lepri, D. McCammon, K. Morgan, F. S. Porter, I. P. Robertson, S. L. Snowden, N. E. Thomas, Y. Uprety, E. Ursino, B. M. Walsh. The origin of the local 1/4-keV X-ray flux in both charge exchange and a hot bubble. Nature, 2014; DOI: 10.1038/nature13525
Citation: University of Miami. “Glow in space is evidence of a hot bubble in our galaxy.” ScienceDaily, 28 July 2014.
Pic: For a more colorful example, I chose the Bubble Nebula (NGC 7635), a 10-light-year-diameter object a mere 11,000 light-years away toward the constellation Cassiopeia, near a giant molecular cloud; copied from http://www.dailygalaxy.com
universal-abyss: Our very own local hot bubble - so cool!
universal-abyss: ‘Invisibility Cloaking system on, please’ - a step towards the ultimate camouflage? How incredibly awesome is this?!
Building ‘invisible’ materials with light
Date: July 28, 2014
Source: University of Cambridge
Summary: A new technique which uses light like a needle to thread long chains of particles could help bring sci-fi concepts such as cloaking devices one step closer to reality.
Pic: This image depicts an efficient route to manufacturing nanomaterials with light through plasmon-induced laser-threading of gold nanoparticle strings. Credit: Ventsislav Valev
A new method of building materials using light, developed by researchers at the University of Cambridge, could one day enable technologies that are often considered the realm of science fiction, such as invisibility cloaking devices.
Although cloaked starships won’t be a reality for quite some time, the technique which researchers have developed for constructing materials with building blocks a few billionths of a metre across can be used to control the way that light flies through them, and works on large chunks all at once. Details are published today (28 July) in the journal Nature Communications.
The key to any sort of ‘invisibility’ effect lies in the way light interacts with a material. When light hits a surface, it is either absorbed or reflected, which is what enables us to see objects. However, by engineering materials at the nanoscale, it is possible to produce ‘metamaterials’: materials which can control the way in which light interacts with them. Light reflected by a metamaterial is refracted in the ‘wrong’ way, potentially rendering objects invisible, or making them appear as something else.
Metamaterials have a wide range of potential applications, including sensing and improving military stealth technology. However, before cloaking devices can become reality on a larger scale, researchers must determine how to make the right materials at the nanoscale, and using light is now shown to be an enormous help in such nano-construction.
The technique developed by the Cambridge team involves using unfocused laser light as billions of needles, stitching gold nanoparticles together into long strings, directly in water for the first time. These strings can then be stacked into layers one on top of the other, similar to Lego bricks. The method makes it possible to produce materials in much higher quantities than can be made through current techniques.
In order to make the strings, the researchers first used barrel-shaped molecules called cucurbiturils (CBs). The CBs act like miniature spacers, enabling a very high degree of control over the spacing between the nanoparticles, locking them in place.
In order to connect them electrically, the researchers needed to build a bridge between the nanoparticles. Conventional welding techniques would not be effective, as they cause the particles to melt. “It’s about finding a way to control that bridge between the nanoparticles,” said Dr Ventsislav Valev of the University’s Cavendish Laboratory, one of the authors of the paper. “Joining a few nanoparticles together is fine, but scaling that up is challenging.”
The key to controlling the bridges lies in the cucurbiturils: the precise spacing between the nanoparticles allows much more control over the process. When the laser is focused on the strings of particles in their CB scaffolds, it produces plasmons: ripples of electrons at the surfaces of conducting metals. These skipping electrons concentrate the light energy on atoms at the surface and join them to form bridges between the nanoparticles. Using ultrafast lasers results in billions of these bridges forming in rapid succession, threading the nanoparticles into long strings, which can be monitored in real time.
“We have controlled the dimensions in a way that hasn’t been possible before,” said Dr Valev, who worked with researchers from the Department of Chemistry and the Department of Materials Science & Metallurgy on the project. “This level of control opens up a wide range of potential practical applications.”
Story Source: The above story is based on materials provided by University of Cambridge. The original story is licensed under a Creative Commons Licence. Note: Materials may be edited for content and length.
Journal Reference: Lars O. Herrmann, Ventsislav K. Valev, Christos Tserkezis, Jonathan S. Barnard, Setu Kasera, Oren A. Scherman, Javier Aizpurua, Jeremy J. Baumberg. Threading plasmonic nanoparticle strings with light. Nature Communications, 2014; 5 DOI: 10.1038/ncomms5568
Citation: University of Cambridge. “Building ‘invisible’ materials with light.” ScienceDaily, 28 July 2014.
Physicists unlock nature of high-temperature superconductivity
Date: July 28, 2014
Source: University of Illinois at Chicago
Summary: Physicists have identified the ‘quantum glue’ that underlies a promising type of superconductivity — a crucial step towards the creation of energy superhighways that conduct electricity without current loss.
The research, published online in the Proceedings of the National Academy of Sciences, is a collaboration between theoretical physicists led by Dirk Morr, professor of physics at the University of Illinois at Chicago, and experimentalists led by Seamus J.C. Davis of Cornell University and Brookhaven National Laboratory.
The earliest superconducting materials required operating temperatures near absolute zero, or −459.67 degrees Fahrenheit. Newer unconventional or “high-temperature” superconductors function at slightly elevated temperatures and seemed to work differently from the first materials. Scientists hoped this difference hinted at the possibility of superconductors that could work at room temperature and be used to create energy superhighways.
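The Fahrenheit figure is simply absolute zero (0 kelvin) restated; the standard conversion, included here only as a sanity check on the article's number:

```python
def kelvin_to_fahrenheit(t_k):
    """Convert a temperature from kelvins to degrees Fahrenheit."""
    return t_k * 9.0 / 5.0 - 459.67

print(kelvin_to_fahrenheit(0))  # -459.67, the figure quoted above
```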
Superconductivity arises when two electrons in a material become bound together, forming what is called a Cooper pair. Groundbreaking experiments performed by Freek Massee and Milan Allan in Davis’s group were analyzed using a new theoretical framework developed at UIC by Morr and graduate student John Van Dyke, who is first author on the report. Their results pointed to magnetism as the force underlying the superconductivity in an unconventional superconductor consisting of cerium, cobalt and indium, with the molecular formula CeCoIn5.
“For a long time, we were unable to develop a detailed theoretical understanding of this unconventional superconductor,” said Morr, who is principal investigator on the study. Two crucial insights into the complex electronic structure of CeCoIn5 were missing, he said: the relation between the momentum and energy of electrons moving through the material, and the ‘quantum glue’ that binds the electrons into a Cooper pair.
Those questions were answered after the Davis group developed high-precision measurements of CeCoIn5 using a scanning tunneling spectroscopy technique called quasi-particle interference spectroscopy. Analysis of the spectra using a novel theoretical framework developed by Morr and Van Dyke allowed the researchers to extract the missing pieces of the puzzle.
The new insight allowed them to explore the 30-year-old hypothesis that the quantum glue of superconductivity is the magnetic force.
Magnetism is highly directional, Morr said.
“Knowing the directional dependence of the quantum glue, we were able, for the first time, to quantitatively predict the material’s superconducting properties using a series of mathematical equations,” he said.
“Our calculations showed that the gap possesses what’s called a d-wave symmetry, implying that for certain directions the electrons were bound together very strongly, while they were not bound at all for other directions,” Morr said. Directional dependence is one of the hallmarks of unconventional superconductors.
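A d-wave gap is conventionally written Δ(k) = Δ₀(cos kx − cos ky) — the textbook form, not a formula given in the article — and it makes the directional dependence Morr describes concrete: the gap magnitude is largest along the crystal axes and vanishes entirely along the diagonals.

```python
import math

def d_wave_gap(kx, ky, delta0=1.0):
    """Textbook d-wave pairing gap: delta0 * (cos kx - cos ky)."""
    return delta0 * (math.cos(kx) - math.cos(ky))

# Along an axis the gap magnitude is maximal (electrons strongly bound)...
print(abs(d_wave_gap(math.pi, 0.0)))         # 2.0
# ...while along the diagonal (kx = ky) it vanishes: a gap "node".
print(d_wave_gap(math.pi / 2, math.pi / 2))  # 0.0
```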
“We concluded that magnetism is the quantum glue underlying the emergence of unconventional superconductivity in CeCoIn5.”
The finding has “lifted the fog of complexity” surrounding the material, Morr said, and was only made possible by “the close collaboration of theory and experiment, which is so crucial in advancing our understanding of complex systems.”
“We now have an excellent starting point to explore how superconductivity works in other complex materials,” Morr said. “With a working theory, we can now investigate how we have to tweak the system to raise the critical temperature — ideally, all the way up to room temperature.”
Story Source: The above story is based on materials provided by University of Illinois at Chicago. The original article was written by Jeanne Galatzer-Levy. Note: Materials may be edited for content and length.
Journal Reference: J. S. Van Dyke, F. Massee, M. P. Allan, J. C. S. Davis, C. Petrovic, D. K. Morr. Direct evidence for a magnetic f-electron-mediated pairing mechanism of heavy-fermion superconductivity in CeCoIn5. Proceedings of the National Academy of Sciences, 2014; DOI: 10.1073/pnas.1409444111
Citation: University of Illinois at Chicago. “Physicists unlock nature of high-temperature superconductivity.” ScienceDaily, 28 July 2014.
from-dust-of-stars: Damn, the ‘quantum glue’ that may lead us to energy conducted without loss of current. Freaking exciting as hell.
How sweet it is: Bioenergy advanced by new tool
Date: July 28, 2014
Source: DOE/Lawrence Berkeley National Laboratory
Summary: Researchers have developed a powerful new tool that can help advance the genetic engineering of ‘fuel’ crops for clean, green and renewable bioenergy — an assay that enables scientists to identify and characterize the function of nucleotide sugar transporters, critical components in the biosynthesis of plant cell walls.
PIC: A family of six nucleotide sugar transporters never before described has been characterized in Arabidopsis, a model plant for research in advanced biofuels. Credit: Roy Kaltschmidt
A powerful new tool that can help advance the genetic engineering of “fuel” crops for clean, green and renewable bioenergy has been developed by researchers with the U.S. Department of Energy (DOE)’s Joint BioEnergy Institute (JBEI), a multi-institutional partnership led by Lawrence Berkeley National Laboratory (Berkeley Lab). The JBEI researchers have developed an assay that enables scientists to identify and characterize the function of nucleotide sugar transporters, critical components in the biosynthesis of plant cell walls.
“Our unique assay enabled us to analyze nucleotide sugar transporter activities in Arabidopsis and characterize a family of six nucleotide sugar transporters that has never before been described,” says Henrik Scheller, the leader of JBEI’s Feedstocks Division and a leading authority on cell wall biosynthesis. “Our method should enable rapid progress to be made in determining the functional role of nucleotide sugar transporters in plants and other organisms, which is very important for the metabolic engineering of cell walls.”
Scheller is the corresponding author, along with Ariel Orellana at the Universidad Andrés Bello, Santiago, Chile, of a paper describing this research in the Proceedings of the National Academy of Sciences (PNAS). The paper is titled “The Golgi localized bifunctional UDP-rhamnose/UDP-galactose transporter family of Arabidopsis.” The lead authors are Carsten Rautengarten and Berit Ebert, both of whom hold appointments with JBEI, and both of whom, like Scheller, also hold appointments with Berkeley Lab’s Physical Biosciences Division. (See below for the full list of co-authors.)
The sugars in plant biomass represent an enormous potential source of environmentally benign energy if they can be converted into transportation fuels — gasoline, diesel and jet fuel — in a manner that is economically competitive with petroleum-based fuels. One of the keys to success in this effort will be to engineer fuel crops whose cell walls have been optimized for sugar content.
With the exception of cellulose and callose, the complex polysaccharide sugars in plant cell walls are synthesized in the Golgi apparatus by enzymes called glycosyltransferases. These polysaccharides are assembled from substrates of simple nucleotide sugars which are transported into the Golgi apparatus from the cytosol, the gel-like liquid that fills a plant cell’s cytoplasm. Despite their importance, few plant nucleotide sugar transporters have been functionally characterized at the molecular level. A big part of the holdup has been a lack of substrates that are necessary to carry out such characterizations.
“Substrates of mammalian nucleotide sugar transporters are commercially available because of the medical interest but have not been available for plants, which made it difficult to study both nucleotide sugar transporters and glycosyltransferases,” Scheller says.
For their assay, Scheller, Rautengarten, Ebert and their collaborators created several artificial substrates for nucleotide sugar transporters, then reconstituted the transporters into liposomes for analysis with mass spectrometry. The researchers used this technique to characterize the functions of the six new nucleotide sugar transporters they identified in Arabidopsis, a relative of mustard that serves as a model plant for research in advanced biofuels.
“We found that these six new nucleotide sugar transporters are bispecific, which is a surprise since the two substrates are not very similar from a physical standpoint to the human eye,” Scheller says. “We also found that limiting substrate availability has different effects on different polysaccharide products, which suggests that cell wall polysaccharide biosynthesis in the Golgi apparatus of plants is also regulated by substrate transport mechanisms.”
In addition to these six nucleotide sugar transporters, the assay was used to characterize the functions of 20 other transporters, the details of which will soon be published.
“Thanks largely to the efforts these past two years of Carsten Rautengarten and Berit Ebert, we now know the activity of three times more nucleotide sugar transporters than are known in humans, and we have determined the function of two-thirds of the plant transporters as compared to one-quarter of the human ones,” Scheller says. “This is a tremendous accomplishment and we are already using this information at JBEI to improve biomass sugar composition for biofuel production.”
Story Source: The above story is based on materials provided by DOE/Lawrence Berkeley National Laboratory. The original article was written by Lynn Yarris. Note: Materials may be edited for content and length.
Journal Reference: C. Rautengarten, B. Ebert, I. Moreno, H. Temple, T. Herter, B. Link, D. Donas-Cofre, A. Moreno, S. Saez-Aguayo, F. Blanco, J. C. Mortimer, A. Schultink, W.-D. Reiter, P. Dupree, M. Pauly, J. L. Heazlewood, H. V. Scheller, A. Orellana. The Golgi localized bifunctional UDP-rhamnose/UDP-galactose transporter family of Arabidopsis. Proceedings of the National Academy of Sciences, 2014; DOI: 10.1073/pnas.1406073111
Citation: DOE/Lawrence Berkeley National Laboratory. “How sweet it is: Bioenergy advanced by new tool.” ScienceDaily, 28 July 2014.
(From left) Berit Ebert, Carsten Rautengarten and Henrik Scheller at JBEI have developed an assay for characterizing the functions of nucleotide sugar transporters in plant cell walls. (Photo by Irina Silva, JBEI)
from-dust-of-stars: Flipping sweet indeed, towards better ‘genetic engineering of ‘fuel’ crops for clean, green and renewable bioenergy.’ Very exciting for prospects of bioenergy.
Cable Companies: Google Threatens Net Neutrality, Not Us
Internet service providers are urging the FCC to regulate popular websites
By Brendan Sasso. July 25, 2014. National Journal
The real threat to online freedom is from Internet giants like Google and Netflix, according to major cable companies.
Those sites could block access to popular content and extort tolls out of Internet service providers, the cable companies warn.
The argument is the backward version of the usual fight over net neutrality.
There is intense public pressure on the Federal Communications Commission to enact net-neutrality regulations that prevent broadband providers from blocking websites or manipulating Internet traffic. Consumer advocacy groups and the major Internet companies warn that because broadband providers like Comcast control their customers’ access to the entire Internet, they have tremendous power to distort the Internet for their own purposes.
But in a filing to the FCC, Time Warner Cable claimed that the controversy over Internet providers potentially charging websites for access to special “fast lanes” is a “red herring.” The real danger, the cable company claimed, is that Google or Netflix could demand payments from Internet providers. Customers expect access to the most popular websites, and an Internet provider may have little choice but to pay up.
The National Cable & Telecommunications Association, which represents all the major cable companies, wrote that “a relatively concentrated group of large [Web companies]—such as Google, Netflix, Microsoft, Apple, Amazon, and Facebook—have enormous and growing power over consumers’ ability to access the content of their choice on the Internet.”
The group argued that Google, which handles about 68 percent of all Internet searches, has far more control over access to other sites than any individual broadband provider does.
“It makes no sense to focus exclusively on Internet access providers and ignore conduct by [websites] that threatens similar harms,” the cable lobbying group wrote.
The threat of being charged for access to websites is a particular focus for the American Cable Association, which represents small cable companies. In its filing, the group warned the FCC that “leaving other Internet actors free to block or discriminate” would “undermine the rules’ goals and effectiveness.”
It’s not that crazy an idea that websites might charge broadband providers for access to their content. After all, cable companies pay for the right to carry TV channels.
But during Netflix’s quarterly earnings call this week, CEO Reed Hastings dismissed the idea of demanding money for the “privilege” of carrying Netflix data.
“I think the Internet really has this different, much more open, architecture than classic cable,” Hastings said. “What you get is this open, vibrant system that the Internet has been so famous for. And that’s really the tradition that we grew up in, and that we’re trying to see carry forward.”
It’s unlikely the FCC would extend its net-neutrality regulations to websites like Netflix. In its proposal, the FCC said that while “other forms of discrimination in the Internet ecosystem may exist … such conduct is beyond the scope of this proceeding.”
The argument comes at a time when Internet regulation is up for grabs: A federal court struck down the FCC’s old net-neutrality rules earlier this year, and FCC Chairman Tom Wheeler is now trying to rework the rules in a way that can survive future court challenges. His proposal has sparked a massive backlash because it would allow broadband providers to charge websites for faster service as long as the agreements are “commercially reasonable.”
Michael Weinberg, vice president of the consumer advocacy group Public Knowledge, said net neutrality is really about preventing Internet service providers from abusing their power as “gatekeepers” of all Internet content. Netflix might be a gatekeeper for access to House of Cards, but that’s an entirely different problem, he said.
But if the FCC allows net neutrality to die, it’s possible that the Internet could begin to resemble cable TV, Weinberg said.
“The Netflixes and Googles of the world may have to pay to get on Comcast, Verizon, and AT&T,” he predicted. “One way that they might make up that money is to charge small or rural ISPs high rates to access Google and Netflix.”
But, for now, the FCC should focus on enacting strong net-neutrality rules that prevent abuses by Internet service providers, Weinberg said.
There have been examples of websites blocking access to content for subscribers of particular Internet providers. But the culprits weren’t Google or Netflix—they were media companies that pulled online videos as part of contract disputes with cable providers.
Last year, for example, CBS blocked its online videos for Time Warner Cable subscribers after the companies were unable to reach an agreement on carrying CBS TV stations. The tactic ensured that customers couldn’t just watch their favorite shows online for free while the actual channel was blacked out.
Weinberg said the online blackouts are troubling, but they are a symptom of a “broken” TV regulatory regime, not evidence that net-neutrality regulations should cover websites.
This article appears in the July 25, 2014 edition of NJ Daily.
universal-abyss: Excellent article on a very convoluted debate that affects us all. A must read - it’s really worth it.