Monthly Archives: January 2012

Supreme Court nixes warrantless tracking

The argument was that since police could follow someone in public, nobody had a reasonable expectation of privacy as to their whereabouts in public, so it was ergo fair game to surreptitiously place tracking devices on the car of anyone police wanted to monitor. It was just a matter of good use of resources: why devote police officers to tailing suspects when technology could do the same thing cheaper and more efficiently? It was “reasonable,” and police could use it at their discretion without judicial oversight.

This article in the WSJ gives the “not so fast” from the Supreme Court in a unanimous ruling that nonetheless split 5-4 along lines of reasoning. The majority opinion, authored by Antonin Scalia and joined by Chief Justice Roberts together with Justices Kennedy, Thomas, and Sotomayor, held that the Fourth Amendment’s protection of “persons, houses, papers, and effects, against unreasonable searches and seizures” logically extends to private property such as automobiles, and that the use of a GPS tracker constituted a “search.” Since the government’s case was based on the claim that it was not, the government’s case was forfeit, and a conviction based on such a “search” was thrown out. They said that was enough reasoning to suit them. Interestingly, Justice Alito, supported by Justices Breyer, Kagan, and Ginsburg, added to this in a concurring opinion. They felt that a property-based argument alone was too narrow to guard against threats to personal privacy from other modern technology.

Given the increasing use of things like cell phone data extractors and Wi-Fi collectors, the more liberal justices seemed to want to serve notice that they will not look kindly on arguments (which they are likely to see) equating the information highway to real highways in this regard, with similar claims about the expectation of privacy. The “expectation of privacy” test protects the person, while the bar on unreasonable search protects property; both are to be protected. The Alito concurrence serves notice that those preparing possible challenges to warrantless police snooping on Wi-Fi, data collection from cell phones, and standoff thermal imaging of the interiors of residences had better have better arguments than that citizens have “no expectation of privacy” over things that are snoopable, thus giving government free rein to do such snooping at its pleasure.

Might future ships run on…seaweed??

This paper in the journal Science reports the discovery of a process that uses genetically engineered E. coli bacteria to metabolize the sugars in seaweed (macroalgae), achieving a yield of 0.281 kg of ethanol per kg of dry seaweed.

The question is: can sufficient seaweed be harvested to make a dent in fuel production? It might not solve the energy challenge writ large, but seaweed aquaculture off of major ports could provide supplemental fuel for navies; a back-of-envelope sketch of the scale follows below. The question then becomes: how large an aquaculture enterprise would environmental concerns allow?
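To get a feel for the numbers, here is a minimal sketch (Python) using the paper’s 0.281 w/w figure. The 10,000-tonne dry harvest is purely hypothetical, picked only to illustrate scale, and the ethanol density is the standard ~0.789 kg/L:

```python
# Back-of-envelope fuel yield from the reported 0.281 w/w conversion.
# The harvest tonnage below is hypothetical; it is here only to show scale.

ETHANOL_YIELD = 0.281             # kg ethanol per kg dry seaweed (from the paper)
ETHANOL_DENSITY_KG_PER_L = 0.789  # standard density of ethanol

def ethanol_from_seaweed(dry_seaweed_kg: float) -> float:
    """Liters of ethanol produced from a given mass of dry seaweed."""
    ethanol_kg = ETHANOL_YIELD * dry_seaweed_kg
    return ethanol_kg / ETHANOL_DENSITY_KG_PER_L

# Example: a hypothetical 10,000-tonne dry harvest from a port aquaculture farm
liters = ethanol_from_seaweed(10_000 * 1000)
print(f"{liters:,.0f} L of ethanol")  # ~3.6 million liters
```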

The ‘Person of Interest’ Machine – The Air Force wants it!

If you are a fan of spy-ish adventure shows, you have probably given CBS’s “Person of Interest” a gander. If not, it’s decent to pretty good, depending on the episode. It’s on my DVR list. The premise is that a reclusive billionaire tech wizard known only as “Finch” (played by actor Michael Emerson, Lost’s Benjamin Linus) builds a “machine” for the government that integrates all forms of surveillance across the country with breakthrough artificial intelligence to identify “persons of interest” – people who are highly probable to be perpetrators or victims based on their activities. Disturbed that the government was going to act only on output from “the Machine” that had national security implications, our genius put a back door in the machine that sends him the Social Security numbers of the highest-likelihood “persons of interest.”

Finch enlists former CIA operative Reese (the show likes singular names…played by Jim Caviezel) as his investigator/action officer, and each episode follows the formula of getting a name, figuring out whether the person is a good guy or a bad guy, and either protecting them or bringing them to justice (sometimes “with extreme prejudice”…).

The lead-ins from each commercial break show the “machine’s-eye view” of the world, with reticles tracking faces and on-screen text of conversations, texts, and (if close enough) lip reading. This somewhat dystopian segue adds interesting context to the plot as it unfolds.

Enter the Air Force. I’m not sure what it is about Air Force culture, but they have a penchant for thinking technology can cure every ill. Having suffered a rather bloody rebuke from a Marine general at JFCOM regarding their darling “effects-based warfare” a few years back, they seem to believe that the problem of not being able to know what the bad guys are going to do just means they have to create a technology that can tell them beforehand. Buoyed by snake-oil salesmen touting “predictive intelligence,” the Air Force has dabbled for years with notions of how such a capability would return their Newtonian effects-based warfare ideas to primacy in a world where “what happens next” is stuck inside people’s heads.

This piece in Wired’s Danger Room discusses the desire of the Chief Scientist of the Air Force to create Finch’s Machine – at least in part. Calling it “Social Radar,” he talks about it “seeing into the hearts and minds of people.”

Does he really mean it? Is he seriously proposing a mind-reading machine? It appears so: “‘Don’t just give me a weather forecast, Air Force, give me an enemy movement forecast.’ What’s that about? That’s human behavior. And so [we need to] understand what motivates individuals, how they behave.”

And if you question whether the vision really rivals the scale and pervasiveness of Finch’s Person of Interest machine, Dr. Maybury describes his “Machine” thus:

Using biometrics, Social Radar will identify individuals, Maybury noted in his original 2010 paper on the topic for the government-funded MITRE Corporation. Using sociometrics, it will pinpoint groups. Facebook timelines, political polls, spy drone feeds, relief workers’ reports, and infectious disease alerts should all pour into the Social Radar, Maybury writes, helping the system keep tabs on everything from carbon monoxide levels to literacy rates to consumer prices. And “just as radar needs to overcome interference, camouflage, spoofing and other occlusion, so too Social Radar needs to overcome denied access, censorship, and deception,” he writes.

The paper opines that radar “provided a superhuman ability to see objects at a distance through the air,” and connects the dots to the need for a new superhuman capability:

“Accordingly, a social radar needs to be not only sensitive to private and public cognitions and the amplifying effect of human emotions but also sensitive to cultural values as they can drive or shape behavior.”

Going further:

For example, radar or sonar enable some degree of forecasting by tracking spatial and temporal patterns (e.g. they track and display how military objects or weather phenomena move in what clusters, in which direction(s) and at what speed.) A user can thus project where and when objects will be in the future. Similarly, a social radar should enable us to forecast who will cluster with whom in a network, where, and when in what kinds of relationships.
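To make the radar analogy concrete, here is a toy sketch (my own illustration, not anything from the paper) of what “forecasting by tracking spatial and temporal patterns” amounts to for physical objects: assume constant velocity and extrapolate. The leap Maybury proposes is applying the same logic to people, whose “dynamics” are nothing like this simple.

```python
# Toy constant-velocity track projection, the kind of spatio-temporal
# forecasting radar actually does. Purely illustrative.

def project(track: list[tuple[float, float, float]], t_future: float) -> tuple[float, float]:
    """track: (t, x, y) observations; returns the predicted (x, y) at t_future."""
    (t0, x0, y0), (t1, x1, y1) = track[-2], track[-1]
    vx = (x1 - x0) / (t1 - t0)  # velocity estimated from the last two fixes
    vy = (y1 - y0) / (t1 - t0)
    dt = t_future - t1
    return x1 + vx * dt, y1 + vy * dt

# A contact moving steadily northeast: where will it be at t = 20?
print(project([(0, 0.0, 0.0), (10, 5.0, 5.0)], 20))  # -> (10.0, 10.0)
```

Physical objects obey physics, so the projection works; a “social radar” has no comparable equations of motion to lean on.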

You can read the entire paper here.

Of course, I’m sure it will filter out such information about U.S. citizens, and will only act on it in dire circumstances of national security. That dad-gum Constitution and all…

We can only hope that the rest of the info is only available to the altruistic likes of Finch and Reese…

The day after the SOPA blackouts…what did they accomplish?

Several items: Time magazine, the Seattle Times, and ABC News all have interesting takes on the anti-SOPA blackout. Time reports that before the blackout 5 Senators were on record against the bill; by one source’s count it is now 19, with 7 more “leaning no” in the Senate. To find the key remaining supporters…follow the money: of the 19 Senators that received over $75K from Hollywood and the music industry, 13 are supporters, and only 1, Roy Blunt (R-MO), is on the record firmly against.

Interestingly, Silicon Valley money does not seem to carry the same weight: of the 16 Senators with over $75K in Silicon Valley money, only 2 are firm “no’s,” though 7 in that category saw Hollywood outspending Silicon Valley, with a “yes” resulting…outspending your competition seems to work 😉 The group “Fight for the Future” claims the Senate “no’s” have increased to 35, but that seems to be a single, biased source. Total donations to the Senate from Hollywood/Big Music were $5.6M (big winner: Barbara Boxer (D-CA) with a whopping $571K), while Silicon Valley gave $4.2M (big winner: Patty Murray (D-Micro$oft…errrrr, WA) with $363K – note Barbara Boxer was 2nd with $348K).

The House is far more hostile, with only 27 on the record for and now 83 opposing. Of the 15 House members with over $75K in Hollywood and music industry money, only 6 supporters remain. Only 7 received over $75K from Silicon Valley; 5 are “no’s,” and none of those 7 were outspent by Hollywood/music interests. $7.9M was donated to House members by Hollywood/Big Music (big winner: Howard Berman (D-CA) with $286K), $6.5M by Silicon Valley (big winner: Anna Eshoo (D-CA) with $163K).

The Administration has weighed in, threatening to oppose legislation that contains language that would make it easier for the government to censor the web or make the internet less secure, but without saying whether it considers either SOPA or PIPA to contain such language. Opponents say it’s obviously implied, but there is a ton of re-election money at stake that Obama does not want to put at risk with a clear answer.

The opposition seems to be remarkably non-partisan, with groups ranging from the expected host of conservative and libertarian organizations to MoveOn.org taking their sites down in protest; Google blacked out its homepage logo for the day.

The “rest of the story” however, is where the interesting innovation is taking place.

The neatest thing that has come out of all this is that the rival legislation to SOPA/PIPA – the OPEN Act – has been used to debut an alternative to the Library of Congress’s legislation publication site “Thomas,” dubbed informally by its users “Madison” here. Rather than Thomas’s bare-bones search engine, which often returns a confusing array of bill versions both with and without pending amendments, Madison employs real-time markup as proposals for changes and amendments are passed – INCLUDING the ability to show those recommended by the public. It’s a Thomas-meets-Wikipedia-and-Twitter experiment in participatory democracy. The site tracks amendments proposed, passed, and failed, along with video clips of proceedings. It has a LOT of room for improvement: traceability and filtering of comments, and more comprehensive video archives cross-indexed to amendments. But compared to the sterile and confusing “if you are not a poliwonk, have fun slogging” experience that is Thomas (my search on ‘SOPA’ resulted in “no items found” – you need the HR/SR number just to get started…), it is a potential “game-changer” for how individuals can observe and, if desired, PARTICIPATE in the sausage-making of bill-crafting.

Is opening the door in this way fraught with peril? Are we not after all a Republic, and not a direct Democracy for good reasons? Can’t the Hollywood and Music shills exploit this to get language they want into bills? Yes. But the current system has that happen behind closed doors enabled by access largely granted by campaign donations. In my mind, anything that moves that out into the light of day, where anybody can see it, is better.

Computer memory advances – Moore’s Law safe for at least another 2 decades

On the heels of last month’s racetrack memory development, IBM announced that it achieved a proof-of-principle demonstration of antiferromagnetic storage that allowed the reliable encoding of a bit of information using just 12 atoms. Typical magnetic drives require on the order of 1 million atoms per encoded bit.
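To put that in perspective, here is the implied density arithmetic (a quick sketch; the terabyte comparison at the end is my extrapolation, not IBM’s claim):

```python
# Density improvement implied by the announcement: ~12 atoms per bit
# vs. ~1,000,000 atoms per bit for conventional magnetic storage.

ATOMS_PER_BIT_CONVENTIONAL = 1_000_000
ATOMS_PER_BIT_IBM = 12

improvement = ATOMS_PER_BIT_CONVENTIONAL / ATOMS_PER_BIT_IBM
print(f"~{improvement:,.0f}x more bits in the same number of atoms")  # ~83,333x

# Equivalently: a 1 TB drive's worth of magnetic real estate could,
# in principle, hold ~83 PB.
print(f"1 TB -> ~{improvement / 1000:.0f} PB")
```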

2008 saw graphene matrix technology achieve milestones that could lead to 3D data arrays with extremely low “leakage” rates (“offs” turning “on” by proximity) and very low power needed to maintain state.

2007 saw phase-change memory take off as the follow-on memory to NAND flash (the tech in flash drives) and possibly DRAM (the current memory tech in PC RAM chips).

The rapid advances on these many fronts have disrupted some plans for industry-wide memory standards, with companies hedging investments in near-term advances like PRAM with mid-term investment in graphene and now long-term R&D into antiferromagnetics. This could lead back to the “bad old days” when companies had individual, proprietary memory solutions. It will likely shake out into a well-ordered progression over the next 15-20 years, but as long-term alternatives present themselves, companies will retreat from going “all-in” on near-term solutions, likely delaying their introduction.

The good news is that reports of 10nm being the limit of Moore’s Law are being smashed, with a tech pathway for near-term “universal memory” – technology that eliminates the need for separate “active memory” (i.e., RAM) and “storage memory” (i.e., hard drives) – and long-term “distributed computing,” where CPUs and memory on a fast network are all available as needed to support processing and memory-allocation tasks.

5 Top Tech Issues to watch in 2012

This article at CIO magazine talks about the five major tech issues for the next year. These are:

Smartphone spectrum – The 2010s are the decade of the smartphone, with tablets, “talking (and listening) cars,” and other “internet appliances” exploding on the scene. So where does the spectrum come from to support this huge demand? One idea is to offer current spectrum license holders (primarily TV stations) a cut of the profits if they sell. The worry is that current legislation may give government “eminent domain”-type abilities to force compliance if not enough spectrum is voluntarily offered. There is also concern about the implementation and what exactly constitutes “licensed spectrum.” The LightSquared vs. GPS debate shows that technical implementation issues can have dramatic “good neighbor” effects on adjacent spectrum, potentially rendering GPS receivers unusable in the vicinity of LightSquared transmitters. In a similar manner, it is possible that 4G (and higher) equipped towers may cause interference with over-the-air broadcast HD TV and radio. As anybody who has dealt with RFI issues on ships knows, just because transmitters and receivers operate on different parts of the spectrum does not mean you can’t have debilitating interference problems that are difficult or impossible to eliminate. Given the demand for fast, mobile internet access, there looks to be a high probability of growing pains that will see “orphaned” uses like broadcast HD TV, and an acceptance of geographic or temporal “holes” in services such as GPS and legacy cell service, in order to satisfy the ravenous demand for fast, mobile internet devices. From an innovation standpoint, what is the relative tradeoff between going “all in” on new ways to deliver content and services and respecting the investment nearly everyone has in the hodgepodge of existing delivery mechanisms?

SOPA/PIPA – The downside of the shift from media delivery by physical means to electronic means is the perceived ease with which creative property can be reproduced and distributed without compensation to the rights holder. I’ve discussed this before. It is a problem. The scope of the problem is only a fraction of what some like the RIAA claim it to be, but it is a problem. What appears to be a straightforward case of addressing “foreign exploitation,” however, has the devil in the details. With the rise of cloud computing, just what is a “foreign website”? The RIAA and its allies want to treat the definition of “foreign” as “anything that is not explicitly domestic” – something that is increasingly impossible to determine, with data centers around the globe exchanging website data as required to service demand from an increasingly networked world. By taking advantage of the fact that the internet is increasingly divorced from terrestrial geography and national borders, the RIAA and its allies are exploiting a seemingly reasonable argument that we can treat “foreign pirates” differently from domestic ones; the result would allow them to treat any website that employs cloud resources as “not explicitly domestic, and therefore foreign.” This sets up a confrontation between IP producers, trying to protect their IP, and resource distributors (like Google, Yahoo, even Microsoft), whom the IP producers want to hold hostage in whole if they can find any little sub-component in violation. The resulting chilling effect on resource providers/distributors borders on extortion, with the content providers licking their chops over a potential mountain of harassing litigation that would dwarf the likes of Righthaven’s efforts. The other issue is the effect “militarizing” the current DNS architecture would have: it could lead the real bad guys to take advantage of security issues in the DNS naming apparatus to create, in effect, a “black internet” free of any regulation whatsoever. Groups like Anonymous could also wreak havoc within the current DNS system, which is based on voluntary compliance within a largely self-regulated system. Right now there is no incentive to “go outside the system” or “attack the system.” Force people out of the system, and odds are they won’t just go “oh, darn, ya got me” but will either operate outside the system or seek retribution against the system.

Kudos to the Obama administration for listening to all sides of the argument, voicing “serious concerns,” and pushing back against the current legislative thinking.

Consumer Privacy – Recent revelations that iPhones and many other smartphones track their owners’ geographic location and read their texts and email for hints about their interests have a lot of people evaluating how “helpful” this is. Particularly in light of law enforcement attempts to download the entire contents of a non-password-protected smartphone (and, with the thinnest probable cause, demand access to a protected one), putting this data in the hands of law enforcement. Attempts to restrict this type of warrantless search in California were recently vetoed by Gov. Brown. In like fashion, national legislative efforts have died because of “national security implications.” Enter the European Union, where privacy still trumps security concerns, and a host of national laws require a comprehensive Euro-zone-wide solution. It is odd that the EU bureaucratic system may set the standard for privacy policy, which the US would have to subsume, at least in principle, for US companies to engage in EU commerce. Such requirements to protect your data from unscrupulous businesses and “apps” (including default use of strong encryption) could well be what protects your mobile data from the prying eyes of over-zealous US law enforcement. Unlike the Patriot Act, where privacy concerns are mostly theoretical in nature, there are already an increasing number of cases where warrantless cell phone data extractions have led to convictions, with some states like California and Michigan being particularly aggressive.

Net Neutrality – The second front in the war on piracy: the issue of internet providers limiting customers’ ability to download what they want, and media content providers’ ability to offer high-density content, still looms large. It’s predominantly the latter, but the former has been cast as an issue of “high bandwidth users are obviously pirates.” The rise of Netflix in particular has shifted this from a user-moderation issue to a supplier-compensation one. The user aspect is the right to download what you want without throttling or capping of access. The provider side is the notion that providers of high-density content like Netflix should be taxed because of the bandwidth requirements of supporting their business model. The internet providers want to have their cake and eat it too, getting increased revenue from high-density content providers while they extract more fees from high-bandwidth users. The “real” net neutrality advocates say neither is appropriate: if an internet provider collects fees for providing “access,” that works both ways, meaning users should get the access promised without throttling and capping, while service providers should not have to pay a premium based on the density of their product. The reality is that content providers have to give something and internet suppliers have to give something; the market should be allowed to sort this out, not have a solution forced on it by legislation. The problem is that many places (like Aquidneck Island) have only a single internet service provider (you will never see FiOS on Aquidneck Island – IT’S JUST NOT PROFITABLE – so we are hostage to Cox…), and a provider with such a monopoly can try to make money on both the consumer and content-provider ends. Net neutrality in its pure sense prevents monopolies from holding both ends hostage and extorting profit without competition.

Cyber-security – This brings all of the above into alignment, as hackers have shown how vulnerable mid-level and “should know better” players are online. The good news here is that the major players have not suffered serious problems. Google, Amazon, and major retailers, like the banking industry, are moderately secure against non-national-level hackers. The issue is, against those high-end national-level threats, just how vulnerable are the high-end players? The respective hands are yet to be played out, but the indications are not encouraging. The question is: since the financial risk is on the commercial sector, should it be relied on to keep us safe, or should the government step in and take the lead? And if the government takes the lead, should it be in the role of supported, or supporting, “commander”…

Test your scientific literacy!

This quiz at the Christian Science Monitor covers a wide range of topics. Be careful: I jumped at a couple of answers without fully RTFQ and got 3 wrong… But I knew them 😉

Astronomers map cosmic Dark Matter

Fermilab reports results of a recent effort to map the distribution of dark matter. Cosmogony (universe-origin) theories require non-interacting matter to exist in order for the “normal” matter to coalesce into galaxy clusters. This new map confirms simulations based on the Friedmann–Lemaître–Robertson–Walker Standard Model of cosmology (popularly known as “Big Bang theory,” and actually a specific solution to Einstein’s general relativity field equations).

This new map adds considerable weight to models that will allow more extensive maps of dark matter to be made from terrestrial telescopes. Understanding the distribution and forms of dark matter will assist in figuring out just what it is composed of, and increase our knowledge of the origin of the universe.

“Artificial Trees” could scrub CO2 from air, then make methanol for fuel…

This Science Now article describes a possible system for removing CO2 from the atmosphere, combining it with hydrogen electrolyzed from water, and making methanol fuel in a sustainable cycle – all based on cheap materials that can be produced in industrial quantities. Ideas like this should be sold on the basis of being possible sustainable sources of energy. It’s also refreshing to see an article about a technology like this treat it as a really good idea on its merits, without invoking climapocalyptic fear-mongering…

Researchers in California have produced a cheap plastic capable of removing large amounts of carbon dioxide (CO2) from the air. Down the road, the new material could enable the development of large-scale batteries and even form the basis of “artificial trees” that lower atmospheric concentrations of CO2 in an effort to stave off catastrophic climate change.

These long-term goals attracted the researchers, led by George Olah, a chemist at the University of Southern California (USC) in Los Angeles. Olah, who won the 1994 Nobel Prize in chemistry, has long envisioned future society relying primarily on fuel made from methanol, a simple liquid alcohol. As easily recoverable fossil fuels become scarce in the decades to come, he suggests that society could harvest atmospheric CO2 and combine it with hydrogen stripped from water to generate a methanol fuel for myriad uses.
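The chemistry of the cycle Olah envisions is textbook: electrolysis splits water, and CO2 hydrogenation yields methanol. A minimal stoichiometry sketch follows (the reactions are standard chemistry; the per-kilogram figures are my arithmetic, not numbers from the article):

```python
# Stoichiometry of the methanol cycle the article describes:
#   electrolysis: 2 H2O -> 2 H2 + O2
#   synthesis:    CO2 + 3 H2 -> CH3OH + H2O

M_CO2, M_H2, M_CH3OH = 44.01, 2.016, 32.04  # molar masses, g/mol

kg_co2_per_kg_methanol = M_CO2 / M_CH3OH    # 1 mol CO2 per mol methanol
kg_h2_per_kg_methanol = 3 * M_H2 / M_CH3OH  # 3 mol H2 per mol methanol

print(f"CO2 consumed: {kg_co2_per_kg_methanol:.2f} kg per kg methanol")  # ~1.37
print(f"H2 required:  {kg_h2_per_kg_methanol:.2f} kg per kg methanol")   # ~0.19
```

So every kilogram of methanol fuel locks up roughly 1.4 kg of atmospheric CO2, at least until the fuel is burned and the cycle repeats.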

Odds and probabilities – Super Bowl innumeracy…

This ESPN story demonstrates how not understanding probability and expectation can lead to saying some REALLY dumb things…

The story here, supposedly, is that the Packers – 9-5 favorites with a 32% overall probability estimate of winning the Big Game – are “weak enough on defense to make it more likely that another team will prevent them from repeating as champions.” “The story is not how prohibitive of a favorite they are, but the fact that they are being given less than a one-in-three chance…the Patriots and Saints combined have a better chance than the Packers.”

Wow – how many times have the odds been so heavily stacked for a single team going into the playoffs that the odds of that team winning were greater than those of all the other teams COMBINED? It’s not clear what the probabilities were, but there have been only a few times that a team came into the playoffs and “couldn’t lose” since there has been a 4-round playoff system (1978): the Bears in ’85 and the 49ers in ’89. The times since then that it came close (the Packers in 1997, the Rams in 2001, and the Pats in 2007), the favorite lost the Super Bowl!

OK, let’s take a look at the numbers here. There are 12 playoff teams, so if all things were equal, each team would have a 1-in-12 chance – a bit over 8% – of winning.

The odds, according to the sourced website, are (seed in parentheses):
Packers (1) 32%
Patriots (1) 18%
Saints (3) 15%
Ravens (2) 10%
Steelers (5) 7%
49ers (2) 7%
Giants (4) 4%
Falcons (5) 2%
Lions (6) 1.5%
Texans (3) 1.5%
Bengals (6) 1%
Broncos (4) 1%

So yes, the odds are that the Packers will not win the Super Bowl. But based on these probabilities, the Packers are nearly 4 times more likely to win than if the teams were of equal strength. Yes, the Patriots and Saints combined have a roughly equal chance (33% to 32%), but that is only because the Packers’ chances are nearly twice as good as either team’s individually. Again, if teams are assumed equal, any two teams should be TWICE as likely to win as any single team, not “equal” to one. The Packers’ chances are nearly equal to those of the other NINE teams combined! When you have to roll up the bottom NINE teams to get odds greater than the favorite’s, that is a pretty strong endorsement of that team! (The quick sanity check below bears this out.)
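Here is that sanity check against the listed probabilities (a minimal sketch; the percentages are exactly those given above):

```python
# Sanity-checking the ESPN numbers: the favorite's edge over a uniform
# 1-in-12 field, and the "bottom nine combined" comparison.

odds = {
    "Packers": 32, "Patriots": 18, "Saints": 15, "Ravens": 10,
    "Steelers": 7, "49ers": 7, "Giants": 4, "Falcons": 2,
    "Lions": 1.5, "Texans": 1.5, "Bengals": 1, "Broncos": 1,
}  # percent; sums to 100

uniform = 100 / len(odds)  # 8.33% if all 12 teams were equal
print(f"Packers vs. uniform field: {odds['Packers'] / uniform:.1f}x")  # ~3.8x

print(f"Patriots + Saints: {odds['Patriots'] + odds['Saints']}%")  # 33%, vs. the Packers' 32%

bottom_nine = sum(p for team, p in odds.items()
                  if team not in ("Packers", "Patriots", "Saints"))
print(f"Bottom nine combined: {bottom_nine:g}%")  # 35%, barely above the Packers alone
```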

If the last 21 years (since the current playoff system started) are any indication, top-2 seeds have lost 17 times and won 15 (though top-2 seeds played each other in 11 of those Super Bowls). But there has not been a Super Bowl in that span that did not include at least one top-2 seed.

1 seeds: (9-11) 2009 Saints(W), 2003 Patriots(W), 1999 Rams(W), 1998 Broncos(W), 1996 Packers(W), 1995 Cowboys(W), 1994 49ers(W), 1993 Cowboys(W), 1991 Redskins(W); 2009 Colts, 2007 Patriots, 2006 Bears, 2005 Seahawks, 2004 Eagles, 2002 Raiders, 2001 Rams, 2000 Giants, 1993 Bills, 1991 Bills, 1990 Bills
2 seeds: (6-6) 2008 Steelers(W), 2004 Patriots(W), 2002 Buccaneers(W), 2001 Patriots(W), 1992 Cowboys(W), 1990 Giants(W); 2010 Steelers, 1998 Falcons, 1997 Packers, 1996 Patriots, 1995 Steelers, 1994 Chargers
3 seeds: (1-1) 2006 Colts(W); 2003 Panthers
4 seeds: (2-3) 2000 Ravens(W), 1997 Broncos(W); 2008 Cardinals, 1999 Titans, 1992 Bills
5 seeds: (1-0) 2007 Giants(W)
6 seeds: (2-0) 2010 Packers(W), 2005 Steelers(W)

So yes, the odds are the Packers will not be Super Bowl champs, but their odds are FAR better than any other team’s. However, this situation has arisen at least 5 times in the past, and the “team that couldn’t lose” has lost 3 of those 5 times… So I would not be too quick to blame the Packer D for the “only a 1/3 chance of winning.” I will refrain from saying any more about the Packers’ chances – good or bad – until this year’s playoffs play out. I’ll take the 1/3 chance and hope it comes through!