Monthly Archives: February 2012
One of Einstein’s statements taken as fact is that the fundamental physical constants of the universe, most notably the speed of light, are the same throughout the universe. Another value thought to be constant is “alpha”, the ‘fine structure constant’ that governs how light and matter interact. The New Scientist reports that there are indications that alpha may not be constant throughout space.
Recent musings about parallel universes have led to speculation that the “fine tuning” of fundamental physical constants is the result of there being infinite parallel universes with differing constants. We are in the “goldilocks” universe not because it was formed for us, but because we could form in it. Now a new wrinkle has been thrown in, as some of these “constants” may not be constant at all, but vary throughout space.
In the case of alpha, it appears that there is a gradient across the universe that gives an orientation of higher and lower values. Alpha is one constant that can be measured at a distance. Things like the speed of light may be impossible to measure remotely.
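The reported pattern is essentially a dipole: alpha appears slightly larger toward one direction on the sky and slightly smaller toward the opposite one. As a rough sketch (the functional form and amplitude below are illustrative assumptions, not the published fit), the fractional shift along a line of sight can be modeled as A·cos θ:

```python
import math

def delta_alpha(amplitude, angle_deg):
    """Fractional shift in alpha for a toy dipole model:
    delta_alpha / alpha = A * cos(theta), where theta is the angle
    between the line of sight and the dipole axis."""
    return amplitude * math.cos(math.radians(angle_deg))

A = 1e-5  # illustrative amplitude, roughly the scale of the reported effect
print(delta_alpha(A, 0))    # looking along the "high alpha" direction
print(delta_alpha(A, 90))   # perpendicular to the axis: essentially no shift
print(delta_alpha(A, 180))  # opposite direction: equally lower
```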
There is still a lot of double checking to be done, as with the hyper-luminal neutrinos (Einstein also famously declared the speed of light to be an absolute speed limit in the universe), but if the result is proven out, it could mean there is another layer of cosmological dynamics out there.
Ars Technica reports on a case where it appears that a legitimate web business was taken down because of what one or more of its customers were doing.
Details remain sketchy, but the fact that the site was allowed to reopen on the .net domain after its .com presence was summarily executed would seem to indicate it was not the site or its owners in the cross hairs. That is little comfort to the company’s owners and their over 400,000 customers, many of whom will assume that if the Secret Service shuttered the site, the owners must be criminals.
So there must have been some good evidence presented to a grand jury to get a multi-million-dollar internet business closed and a banner displayed in its stead that says “NS1.SUSPENDED-FOR.SPAM-AND-ABUSE.COM” without even notifying the owners, right?
All it took was a request from a prosecutor to GoDaddy.com, the domain registrar. JotForm.com has had incidents of phishers using its service to try to harvest personal info, but it thought it had cooperated responsibly with authorities. It could of course be that the owner is guilty of crimes the Secret Service is interested in, but if that were the case, why allow the site to go right back on-line with a minor domain change (.net from .com)?
If it was a matter of the site’s customers being the criminals, why not go to the site owners and ask them to cooperate instead of going to the domain provider and closing the site “with prejudice”? How can internet commerce thrive when sites can be taken down and customers led to believe the site promotes spam and abuse, simply on a prosecutorial request? What are the rights of those trying to do business on the internet? Should those that run sites have their reputations held hostage by those that use the site?
How do you balance the responsibility of law enforcement to police the internet with the responsibilities of service providers and clients? Do domain providers “own” the access they provide, such that they can withdraw it upon request? Is there any recourse if someone is portrayed as a “spammer” without a judgement by a court?
Tough questions that need legal answers if the internet is going to be a fair and equitable landscape. I doubt Google would be subject to such treatment. But if a company with 400,000 customers can be affected so dramatically without judicial oversight, the internet will become the domain of those “too big to fail.”
PC Mag reports on a home for testing driverless cars.
iProgrammer has a couple of articles about the science behind whether it can be “proven” that such cars are “uncrashable”, as some want to demand in approval legislation. This gets to issues about “why does software have bugs?” and the corollary: why does software that is sold have any bugs at all?
It comes down to doing things in parallel with feedback between “lines of operation” – this is the recipe for a “complex system” and raises the spectre of “emergent behavior”.
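A toy illustration of why such systems resist exhaustive testing: even a one-line feedback loop (the logistic map in its chaotic regime, used here purely as an analogy, not as a model of any vehicle) amplifies a one-part-in-a-billion difference in starting state until the two trajectories no longer agree, so no finite test suite can cover every path:

```python
# Chaotic feedback loop: each output is fed back as the next input.
def iterate(x, steps, r=4.0):
    for _ in range(steps):
        x = r * x * (1.0 - x)   # feedback between "lines of operation"
    return x

a = iterate(0.123456789, 50)
b = iterate(0.123456790, 50)    # initial difference: one part in a billion
print(abs(a - b))               # after 50 iterations the states have diverged
```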
Then there is Vebber’s corollary to Godwin’s Law (“As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1”): “The longer a tech thread goes on, the probability of a comparison involving Skynet approaches 1.”
The New Scientist reports on a newly discovered ability of simple yeast to overcome hostile changes in the environment by creating a prion from a “quality control” protein known as Sup35.
Prions were discovered as the infectious agents that cause transmissible spongiform encephalopathy in a variety of mammals, including mad cow disease and Creutzfeldt-Jakob disease. These diseases affect neural tissues by causing the misfolding of proteins that accumulate and eventually cause nerve cell death. All forms are incurable and universally fatal.
What makes prions unique is their lack of nucleic acids to direct their replication. Rather than being reproductive in the normal biological sense of assembling copies of themselves, prions replicate mechanically by capturing a normal protein and rearranging its structure into the misfolded prion form. As such they are in effect nature’s own “nano-machines”. There is still debate as to what sort of catalyst “loads the target protein into the prion machine”, though the indication is that an as yet unidentified “third-party” protein is responsible.
So what does this have to do with evolution?
To date, two mechanisms were known by which genetic change results in adaptation: mutation, a change in the DNA sequence itself, and epigenesis, the “skipping” of parts of the genetic code when it is read. Prions now appear to be responsible for a third mechanism, related to epigenesis, where instead of blocking the reading of part of the sequence, the entire sequence is read, allowing coding of proteins that are typically ignored.
The result is that the yeast generates a hotchpotch of brand-new proteins without changing its DNA in any way. Within that mix of new proteins could be some that are crucial for survival.
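A minimal sketch of the readthrough idea, assuming the prion state simply disables normal stop-codon termination by Sup35 (the codon table is abbreviated and the gene sequence invented for illustration):

```python
# Toy model of prion-mediated stop-codon readthrough. Sup35 normally
# terminates translation at a stop codon; when it is sequestered in
# prion form, the ribosome can read through and translate sequence
# that is normally silent.

CODON_TABLE = {
    "ATG": "M", "TTT": "F", "GGC": "G", "AAA": "K",
    "TAA": "*", "TGA": "*", "TAG": "*",  # stop codons
    "CAT": "H", "GAA": "E",
}

def translate(dna, readthrough=False):
    protein = []
    for i in range(0, len(dna) - 2, 3):
        aa = CODON_TABLE.get(dna[i:i+3], "X")
        if aa == "*":
            if not readthrough:
                break            # normal termination (Sup35 active)
            protein.append("x")  # stop codon read as some residue instead
        else:
            protein.append(aa)
    return "".join(protein)

gene = "ATGTTTGGCTAACATGAA"  # hypothetical: short ORF, stop, hidden tail
print(translate(gene))                    # normal state: "MFG"
print(translate(gene, readthrough=True))  # prion state: "MFGxHE"
```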
Susan Lindquist at the Whitehead Institute for Biomedical Research in Cambridge, Massachusetts, first saw this process, which she calls “combinatorial evolution”, in 2004, while studying lab-grown Baker’s yeast (Saccharomyces cerevisiae).
“We’ve been saying this is really cool and a way of producing new traits for years, but other people have said it’s a disease of lab yeast,” she says.
Now she’s proved the sceptics wrong by demonstrating beyond doubt that the same process happens in nature too. She has seen it at work in 255 of 700 natural yeasts she and her colleagues have studied.
The result is that yeast Lindquist grew in hostile oxygen-depleted or acidic environments was able to adapt and thrive over time – unless succeeding generations were exposed to prion-destroying chemicals. Without the prions, the colonies of yeast withered and died off.
A host of questions are now being raised about what triggers prion creation, whether changes can become “permanently coded” into the genome (theoretically possible, but not observed yet) and what the implications are for higher forms than fungi.
The idea that a prion could be an “evolution accelerator” makes you look at a bunch of sci-fi plots in a whole new light.
This New Scientist article reports that Southern Company, an Atlanta, GA-based utility, has been given approval by the NRC to build a pair of reactors at its Vogtle Plant near Waynesboro.
As a side note, the Vogtle Plant complex (currently with 2 operating reactors) is named for Alvin Vogtle, a WWII POW who inspired the character played by Steve McQueen in the movie The Great Escape.
With this approval, it is expected that the NRC will also approve 4 more reactors in the near future. SC Electric and Gas is pursuing licenses for 2 near Jenkinsville, SC, and Florida Power and Light has proposals for 2 others in Florida.
All three sites propose using Westinghouse’s new AP1000 reactor, a 1000 MWe-class Gen III+ pressurized water reactor with a passive cooling system that can keep the reactor safe for 72 hours without power.
China was the first to build the AP1000, with two pairs in commission. India is building 6 Gen III+ reactors (though of a different design from the AP1000). The emerging economies are where growth in the Gen III+ market (large pressurized water reactors with passive safeguards and potential 120-year lifetimes with refueling) is projected to explode, with 65 reactor projects currently under construction and 52 countries asking the IAEA for help starting nuclear programs. The first tier includes the UAE, Saudi Arabia and Turkey. Egypt was among those the IAEA intended to help, but the recent unrest has shifted it to the “motivated but politically unstable” list.
Just under half of the 65 reactors under construction are in China…
This Science News article sounds like an episode of “Mythbusters”. College in particular led most of us to muse whether we got a little extra kick of creative energy with a bit of a buzz on, and now a study claims to confirm it: men who were “tipsy” (as we used to say, “had a glow going”) scored 50% higher on a word association test – thinking of a “linking word” that connects three given words (e.g. given peach, arm, and tar, responding ‘pit’).
Participants were divided into two groups of 20, who were given the word test and performed comparably. Then, while watching an animated movie, one group was given enough vodka-cranberry drink to reach an average of .075 BAC, just below the legal limit of intoxication (.08); they also ate a snack. The other group ate and drank nothing. On a second test, the tipsy group completed an average of nine problems correctly, averaging 11.5 seconds per response, while the sober group averaged only six, taking 15.2 seconds.
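The “50% higher” figure follows directly from those numbers:

```python
# Comparison of the reported results: problems solved and average
# response time for the tipsy vs. sober groups.
tipsy_solved, tipsy_sec = 9, 11.5
sober_solved, sober_sec = 6, 15.2

improvement = (tipsy_solved - sober_solved) / sober_solved
print(f"Accuracy improvement: {improvement:.0%}")        # 50%
print(f"Speedup per answer: {sober_sec / tipsy_sec:.2f}x")
```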
I know, it was the snack, not the booze, right?
CNET reports on a Belgian company that uses a laser to melt additive layers of titanium powder, creating a solid object directly from a CAD drawing. In this particular instance, the object is a human jaw bone, replete with integral screw threads and myriad nooks and crannies to foster supporting muscle and nerve growth – all precisely accurate to 1/33rd of a millimeter.
This technology takes what has been a rapid prototyping technique for materials like wax, plastics and resins to finished end products of the highest durability (it doesn’t get much more durable than titanium…). One can imagine this process combining with recent regenerative advances to narrow the divide between prosthetic and “replacement” limbs.
MIT Tech Review reports that terahertz sensing technology is nearing maturity. One of the first applications is in stand-off scanners that allow police to check people for concealed weapons from a distance. Current models have a range of about 15 ft, but should be tunable for ranges up to 75 ft. The scanners would augment the current “stop and frisk” practice of stopping people on the street for questioning; if officers have “reasonable suspicion” – a lesser standard than the “probable cause” that can lead to a search warrant – they conduct a pat-down search for weapons.
Law enforcement advocates tout the tech as a way to protect police, who are often assaulted or even shot during such episodes. Privacy advocates cry foul, claiming this is a further erosion by remote sensing technology of constitutional protections from “unreasonable searches and seizures”. They argue that remote scanning technology has the potential to make physical searches for many types of items unnecessary and, importantly, conducted without the subject’s knowledge. “The Fourth Amendment doesn’t vanish when you leave your house,” a privacy advocate maintains.
The tension between “fair use” of information about what you do online and within stores and the spectrum of privacy expectations people have is increasingly going to cause dust-ups in both the real and virtual worlds.
This Ars Technica article talks about a start-up that has developed a technology that removes a music file and its digital rights from one computer or device and transfers it to another. The problem is that moving the file from the seller’s computer to its server, and thence to the buyer’s computer, involves copying the file – something that Capitol Records says is a criminal copyright violation liable for up to $150K in fines per file.
ReDigi – the company – claims an exemption under the “essential step” clause in copyright law, which allows copying a computer program if it is an “essential step” in the utilization of the program:
(a) Making of additional copy or adaptation by owner of copy. Notwithstanding the provisions of section 106 [ 17 USC 106 ], it is not an infringement for the owner of a copy of a computer program to make or authorize the making of another copy or adaptation of that computer program provided:
(1) that such a new copy or adaptation is created as an essential step in the utilization of the computer program in conjunction with a machine and that it is used in no other manner.
Capitol Records claims that the entire analogy to a “used record store” is inapplicable because used record stores do not make copies of records, but resell the physical object. Since a digital music file is not a physical object, but a license to reproduce specific intellectual property – a license that is not transferable – they say ReDigi’s enterprise is based on theft of their intellectual property. They say the “essential step” defense is also inapplicable because a music file is not a “computer program” – it is simply data with no executable code. This is not entirely true, because the digital rights management code embedded in the music file is executable code – which, of course, Capitol says is a “container” protecting the file and not an actual part of the file itself.
On the face of it, ReDigi appears to have a tough road ahead, but a lot will depend on the extent to which the courts buy into the “digital used record store” analogy, and on whether a music file is considered a “computer program” or whether adding digital rights management changes that. There has already been push-back in the courts against the idea that you don’t actually “own” a music file but have simply purchased a non-transferable license to use it. Similar arguments regarding computer programs have already failed to sway courts, which have allowed the resale of computer software. That, however, involved the transfer of the physical media the program was distributed on – making it a direct equivalent to the used record store and not an “analogy”.
This MIT Tech Review article references a study into the use of “socialbots” to create connections between social groups on Twitter.
The question this invites is: could socialbots be employed to foster desired patterns of connection among social groups and discourage undesirable ones? What is the impact of the overtness (or covertness) of such activities? If social connections are made, and mature, what would be the impact of discovering that the introduction starting the connection was made by someone else to further an agenda?
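The bridging effect such a study relies on can be sketched with a toy graph (the groups, edges, and “bot” node here are invented for illustration): two cliques with no mutual ties become reachable within a few hops once a single bot befriends one member of each.

```python
from collections import deque

def shortest_path_len(edges, src, dst):
    """BFS over an undirected edge list; returns hop count or None."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, d = queue.popleft()
        if node == dst:
            return d
        for nxt in adj.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, d + 1))
    return None  # unreachable

# Two tight-knit groups (cliques) with no ties between them.
group_a = [(a, b) for a in range(5) for b in range(a + 1, 5)]
group_b = [(a, b) for a in range(5, 10) for b in range(a + 1, 10)]
print(shortest_path_len(group_a + group_b, 0, 9))  # None: no connection

# A "socialbot" befriends one member of each group, bridging them.
bridged = group_a + group_b + [("bot", 0), ("bot", 5)]
print(shortest_path_len(bridged, 0, 9))  # 3 hops: 0 -> bot -> 5 -> 9
```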
Could intelligent agents of this type be “autonomous cyber vehicles”?