Tracks in the Sand
Comments, Commentary, and field notes from the front line of the evolution of Global Economic Development and Open Technologies Infrastructures
Saturday, July 31, 2010
Japan Rising: NEDO's $27M Collaboration with Los Alamos
Michael Kanellos, March 5, 2010
Coming to America: The Japanese Home?
Smart grid tests in New Mexico could lead to some really cool household appliances.
Japan has some of the most energy efficient homes and office buildings in the world. Now, a consortium of companies and researchers will try to figure out how well some of the underlying technologies play in the U.S.
The New Energy and Industrial Technology Development Organization (NEDO) today formally unveiled a series of smart grid initiatives with national laboratories and utilities in New Mexico. Under the alliance, companies like Hitachi, Panasonic, NTT DoCoMo, Mitsubishi and others will integrate prototype and existing technologies into buildings and grids and then test them out.
In one experiment, for instance, a microgrid will be equipped with a hybrid storage system consisting of sodium sulfur and lead acid batteries while homes on the same grid will be rigged up with solar panels and individual lithium ion battery packs for storing power from the solar array. An office building, meanwhile, will be equipped with solar panels, a large (80-kilowatt) fuel cell and other technologies.
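To make the home-side setup a little more concrete, here is a minimal back-of-the-envelope sketch in Python of a house with rooftop solar and a lithium-ion pack. The panel size, battery capacity, efficiency, and load profile are invented for illustration; they are not figures from the NEDO project.

```python
# Minimal sketch (hypothetical numbers, not NEDO project specs):
# a home with rooftop solar and a lithium-ion pack, simulated hour by hour.

SOLAR_KW_PEAK = 4.0        # assumed rooftop array size (kW)
BATTERY_KWH = 10.0         # assumed lithium-ion pack capacity (kWh)
ROUND_TRIP_EFF = 0.90      # assumed battery round-trip efficiency

# Very rough hourly profiles for one day (kW); purely illustrative.
solar = [0]*6 + [0.5, 1.5, 2.5, 3.5, 4.0, 4.0, 3.8, 3.0, 2.0, 1.0, 0.3] + [0]*7
load  = [0.4]*6 + [0.8, 1.0, 0.7, 0.6, 0.6, 0.7, 0.7, 0.6, 0.6, 0.8, 1.2] + [1.5, 1.8, 1.6, 1.2, 0.8, 0.5, 0.4]

soc = 0.5 * BATTERY_KWH    # start the day at 50% state of charge
grid_import = 0.0

for pv, demand in zip(solar, load):
    net = pv - demand                      # surplus (+) or deficit (-) this hour, kWh
    if net >= 0:                           # charge the pack with surplus solar
        soc = min(BATTERY_KWH, soc + net * ROUND_TRIP_EFF)
    else:                                  # discharge the pack, then fall back to the grid
        draw = min(soc, -net)
        soc -= draw
        grid_import += (-net - draw)

print(f"End-of-day state of charge: {soc:.1f} kWh")
print(f"Energy imported from the grid: {grid_import:.1f} kWh")
```

The point of an experiment like NEDO's is to replace guesses like these with measured data from real homes on a real feeder.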
Other experiments will examine and test security and reliability of the grid. New Mexico is a great place for testing these technologies, I was told: the homes will have to endure hail, lightning, high altitude and other environmental extremes.
If all goes well, the four-year experiment will do more than just compile data. It could serve as a proof point for many Japanese companies. Panasonic has been preparing a smart home strategy over the last several years. Check out this video of Panasonic's latest demonstration home in Tokyo complete with a home fuel cell, water-efficient appliances, and automatically dimming LED bulbs.
In 2007, Panasonic executives told me the company was largely contemplating delivering these technologies to customers in Japan and maybe Europe. Since then, the market for energy efficiency and smart grid technologies in the U.S. -- along with the incentives to retrofit -- has grown.
At CEATEC outside of Tokyo in late 2009, Sharp showed off a number of home technologies -- home charging stations, LED bulbs -- that it hopes to sell. Hitachi and others, meanwhile, have demonstrated energy efficient TVs that can serve as portals for home automation and energy management, as well as novel washing machines and household appliances. More video here.
To date, perhaps the one thing that Japanese companies have lacked is simply the willingness to market aggressively in the U.S. The conservative tack many Japanese companies take when it comes to marketing their domestic energy efficiency technologies in the U.S. was one of the big topics at a recent clean tech summit sponsored by the Japan External Trade Organization (JETRO) in San Francisco. Rob Schmitz, the Los Angeles Bureau Chief for KQED, recently visited a modular home builder while on a reporting trip in Japan.
"These are fantastic," he told the builder. "Are you going to bring these to the U.S.?"
"'Hmmm.....Maybe Australia,' he said," Schmitz told me.
New Mexico might help prod that along.
Thursday, June 05, 2008
37-Year-Old Elmer Bandit Closing In on 20,710 Miles!!
Only yesterday, Missouri owner Mary Anna Wood completed the paperwork for a further nine rides that she and Elmer intend to enter this season.
Elmer, among the first five horses inducted into the sport's Hall of Fame in 1980, successfully completed a two-day ride at Perry Lake, in Kansas, on the weekend of April 26.
Mary Anna and Elmer completed 33 miles (53km) on the Saturday and 20 miles (32km) on the Sunday.
He passed the veterinary checks without a problem.
Elmer now has 20,300 certified competitive miles (32,670km), and is only about seven rides away from passing the national record of 20,710 miles (33,330km) held by a saddlebred horse, Wing Tempo.
[More ...]
Thursday, November 16, 2006
Looking for the Future in the Past ...
If the experiment works, a signal could be received before it's sent
By TOM PAULSON
P-I REPORTER
If his experiment with splitting photons actually works, says University of Washington physicist John Cramer, the next step will be to test for quantum "retrocausality."
That's science talk for saying he hopes to find evidence of a photon going backward in time.
"It doesn't seem like it should work, but on the other hand, I can't see what would prevent it from working," Cramer said. "If it does work, you could receive the signal 50 microseconds before you send it."
Uh, huh ... what? Wait a minute. What is that supposed to mean?
Roughly put, Cramer is talking about the subatomic equivalent of arriving at the train station before you've left home, of winning the lottery before you've bought the ticket, of graduating from high school before you've been born -- or something like that.
"It probably won't work," he said again carefully, peering through his large glasses as if to determine his audience's mental capacity for digesting the information. Cramer, an accomplished experimental physicist who also writes science fiction, knows this sounds more like a made-for-TV script on the Sci Fi Channel than serious scientific research.
"But even if it doesn't work, we should be able to learn something new about quantum mechanics by trying it," he said. What he and UW colleague Warren Nagourney plan to try soon is an experiment aimed at resolving some niggling contradictions in one of the most fundamental branches of physics known as quantum mechanics, or quantum theory.
"To be honest, I only have a faint understanding of what John's talking about," Nagourney said, smiling. Though claiming to be "just a technician" on this project, Cramer's technician partner previously assisted with the research of Hans Dehmelt, the UW scientist who won the 1989 Nobel Prize in physics.
Quantum theory describes the behavior of matter and energy at the atomic and subatomic levels, a level of reality where most of the more familiar Newtonian laws of physics (why planets spin, airplanes fly and baseballs curve) no longer apply.
The problem with quantum theory, put simply, is that it's really weird. Findings at the quantum level don't fit well with either Newton's or Einstein's view of reality at the macro level, and attempts to explain quantum behavior often appear inherently contradictory.
"There's a whole zoo of quantum paradoxes out there," Cramer said. "That's part of the reason Einstein hated quantum mechanics."
One of the paradoxes of interest to Cramer is known as "entanglement." It's also known as the Einstein-Podolsky-Rosen paradox, named for the three scientists who described its apparent absurdity as an argument against quantum theory.
Basically, the idea is that interacting, or entangled, subatomic particles such as two photons -- the fundamental units of light -- can affect each other no matter how far apart in time or space.
"If you do a measurement on one, it has an immediate effect on the other even if they are separated by light years across the universe," Cramer said. If one of the entangled photon's trajectory tilts up, the other one, no matter how distant, will tilt down to compensate.
Einstein ridiculed the idea as "spooky action at a distance." Quantum mechanics must be wrong, the father of relativity contended, because that behavior requires some kind of "signal" passing between the two particles at a speed faster than light.
This is where going backward in time comes in. If the entanglement happens (and the experimental evidence, at this point, says it does), Cramer contends it implies retrocausality. Instead of cause and effect, the effect comes before the cause. The simplest, least paradoxical explanation for that, he says, is that some kind of signal or communication occurs between the two photons in reverse time.
It's all incredibly counterintuitive, Cramer acknowledged.
But standard theoretical attempts to deal with entanglement have become a bit tortured, he said. As evidence supporting quantum theory has grown, theorists have tried to reconcile the paradox of entanglement by basically explaining away the possibility of the two particles somehow communicating.
"The general conclusion has been that there isn't really any signaling between the two locations," he said. But Cramer said there is reason to question the common wisdom.
Cramer's approach to explaining entanglement is based on the proposition that particles at the quantum level can interact using signals that go both forward and backward in time. It has not been the most widely accepted idea.
But new findings, especially a recent "entangled photon" experiment at the University of Innsbruck, Austria, testing conservation of momentum in photons, have provided Cramer with what he believes is reason for challenging what had been an untestable, standard assumption of quantum mechanics.
The UW physicists plan to modify the Austrians' experiment to see if they can demonstrate communication between two entangled photons. At the quantum level, photons exist as both particles and waves. Which form they take is determined by how they are measured.
"We're going to shoot an ultraviolet laser into a (special type of) crystal, and out will come two lower-energy photons that are entangled," Cramer said.
For the first phase of the experiment, to be started early next year, they will look for evidence of signaling between the entangled photons. Finding that would, by itself, represent a stunning achievement. Ultimately, the UW scientists hope to test for retrocausality -- evidence of a signal sent between photons backward in time.
In that final phase, one of the entangled photons will be sent through a slit screen to a detector that will register it as either a particle or a wave -- because, again, the photon can be either. The other photon will be sent toward two 10-kilometer (6.2-mile) spools of fiber optic cables before emerging to hit a movable detector, he said.
Adjusting the position of the detector that captures the second photon (the one sent through the cables) determines whether it is detected as a particle or a wave.
The trip through the optical cables also will delay the second photon relative to the first one by 50 microseconds, Cramer said.
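The 50-microsecond figure is simple arithmetic: light in silica fiber travels at roughly the vacuum speed of light divided by the fiber's refractive index. A minimal sketch, assuming a typical refractive index of about 1.5 (a standard value, not one given in the article):

```python
# Minimal sketch: propagation delay of a photon through optical fiber.
# The refractive index (~1.5 for silica fiber) is a typical assumed value;
# about 10 km of such fiber gives roughly the 50-microsecond delay Cramer cites.

C = 299_792_458.0          # speed of light in vacuum, m/s
N_FIBER = 1.5              # assumed refractive index of the fiber

def fiber_delay_us(length_m: float, n: float = N_FIBER) -> float:
    """Delay in microseconds for light traversing length_m of fiber."""
    return length_m * n / C * 1e6

print(f"10 km of fiber:   {fiber_delay_us(10_000):.1f} microseconds")
print(f"Two 10 km spools: {fiber_delay_us(20_000):.1f} microseconds")
```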
Here's where it gets weird.
Because these two photons are entangled, the act of detecting the second as either a wave or a particle should simultaneously force the other photon to also change into either a wave or a particle. But that would have to happen to the first photon before it hits its detector -- which it will hit 50 microseconds before the second photon is detected.
That is what quantum mechanics predicts should happen. And if it does, signaling would have gone backward in time relative to the first photon.
"There's no obvious explanation why this won't work," Cramer said. But he didn't consider testing this experimentally, he said, until he proposed it in June at a meeting sponsored by the American Association for the Advancement of Science.
"I thought it would get shot down, but people got excited by it," Cramer said. "People tell me it can't work, but nobody seems to be able to explain why it won't."
If the UW experiment succeeds at demonstrating faster-than-light communication and reverse causation, the implications are enormous. Besides altering our concept of time, the signaling finding alone would almost certainly revolutionize communication technologies.
"A NASA engineer on Earth could put on goggles and steer a Mars rover in real time," said Cramer, offering one example.
Even if this does fail miserably, providing no insights, Cramer said the experience could still be valuable. As the author of two science-fiction novels, "Twistor" and "Einstein's Bridge," and as a columnist for the sci-fi magazine Analog, the UW physicist enjoys sharing his speculations about the nature of reality with the public.
"I want people to know what it's like to do science, what makes it so exciting," he said. "If this experiment fails in reality, maybe I'll write a book in which it works."
Tuesday, November 14, 2006
Intel Announces Web 2.0 Suite
November 8, 2006
suitetwo.com
Intel has partnered with a group of Web 2.0 companies to form a suite of Web software that will help small and medium businesses share information, the company announced Tuesday.
Intel and its partners made the announcement at the annual Web 2.0 conference in San Francisco, which brings together all the players in the movement popularly known as “Web 2.0” -- the second wave of the Internet phenomenon, which allows consumers to create their own dynamic web pages, collaborate with others, and share photos and videos.
Known as Suite Two, it is a combination of interconnected services from different Web 2.0 companies that aims to improve productivity within small and medium-sized businesses. Intel wants to bring consumer technologies to businesses, the company said.
“The idea is to develop a suite which includes capabilities like blogging, wikis, and RSS,” said Lisa Lampert, managing director of Intel Capital’s Software and Solutions Group. Intel Capital is the venture arm of the world’s largest chipmaker and has investments in several companies that are enabling the Web 2.0 phenomenon such as SixApart, SpikeSource, and Zend Technologies.
SixApart and SpikeSource will form a large part of the Suite Two offering, along with Socialtext, NewsGator, and SimpleFeed.
SpikeSource will provide the integration platform for Suite Two, bringing infrastructure and applications together. SixApart is a provider of blogging software, and Socialtext sells software that helps companies collaborate through wikis -- private web pages that can be edited directly instead of sending emails and attachments back and forth.
NewsGator and SimpleFeed will provide RSS (Really Simple Syndication) capabilities for accessing external news and industry information.
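As a rough illustration of what RSS aggregation looks like in code, here is a minimal sketch using the third-party Python feedparser library. Neither the library nor the feed URLs are mentioned in the article, and Suite Two's actual RSS components come from NewsGator and SimpleFeed; this just shows the basic fetch-and-list pattern.

```python
# Minimal sketch of RSS aggregation (assumes the third-party feedparser
# package: pip install feedparser). Feed URLs below are hypothetical.
import feedparser

FEEDS = [
    "https://example.com/industry-news.rss",
    "https://example.com/competitor-blog.rss",
]

for url in FEEDS:
    feed = feedparser.parse(url)                 # fetch and parse one feed
    print(feed.feed.get("title", url))
    for entry in feed.entries[:3]:               # show the three newest items
        print("  -", entry.get("title", "(untitled)"), entry.get("link", ""))
```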
The suite will operate on PC-based hardware, and all the services will be integrated with a single sign-on and the kind of rich user interface seen on most consumer web sites. The suite will be released in early 2007 and available through Intel’s global channel of OEMs, distributors, and resellers. It will run on Red Hat Enterprise Linux, SUSE Linux, and Microsoft Windows operating systems, the company said.
Intel is investing in these companies and solutions to ensure that these next-generation technologies work well on Intel-based hardware, Ms. Lampert said.
Intel and its partners are banking on the fact that small and medium businesses do not have IT departments and need external, plug-and-play solutions to run their businesses while also adopting new technologies, said SpikeSource CEO Kim Polese.
The suite is priced relatively low, at $175 to $200 per user per year, to drive adoption within SMBs. Socialtext CEO Ross Mayfield said these technologies are already being used in large corporations and SMBs should benefit from them, too.
“You have projections such as 50 percent of companies will be using wikis by 2008 and 80 percent of them will be using blogs,” Mr. Mayfield said. “This is particularly driven by how popular Web 2.0 tools are on the consumer side; you are seeing some grassroots demand and for organizations [like ours] to fulfill the demand.”
Intel’s Ms. Lampert said the above-mentioned features will be available in the first release, and forthcoming releases will add more advanced features such as social networking, podcasting, and mobility. Intel is working with leaders in each of these new categories to partner for the Suite Two offering, she said.
Monday, October 30, 2006
Sahana: Open Source Disaster Management
Rasmussen, an MD and Commander in the U.S. Navy Medical Corps, pointed to cooperation among civil and military teams, and among tech industry rivals Google, IBM and Microsoft, in the development of Sahana, a free, open source disaster management system.
Lessons in disaster response from Strong Angel III:
1) It provides a cross-sectional glimpse at the current state of the market.
2) It gives a look into the needs of specific classes of responders.
3) It delivers a demonstrable collection of service-level applications (black boxes).
Strong Angel III: http://blogs.zdnet.com/BTL/?p=3485
Thursday, August 24, 2006
Governments' role in open source
The decision by Croatia to endorse open source leads to this question. (I assume Hrvatska is Croatia in Croatian.)
What should government's role be in open source?
I know what many of you will say. None. Government's role in a capitalist society is to promote private enterprise, period.
Maybe. But to me that's a theological argument, one totally at odds with American history. From canals to railroads to highways to the Internet itself, the U.S. government has always funded public works, and endorsed policies aimed at improving conditions for certain industries over others.
Governments outside the U.S. are increasingly adopting policies that support open source. They have several reasons. They want to save money. But they also want to encourage local developers, and starting from an open source base means they start from a higher level of complexity than if they were building from scratch.
There is nothing Americans can do to change this. The open source genie is out of the bottle. The Internet is out of its Pandora's box. (Insert your own metaphor here.)
The problem is that, in the past, when American industries were threatened by foreign competition, America generally chose the worst possible policies to support them. Mainly we subsidized incumbents, the outfits with the biggest lobbying arms, rather than doing all we could to encourage entrepreneurship.
Now American software leadership is under threat, with a business model Americans had a big hand in creating. The first state to back open source, Massachusetts, is now backing away from that commitment, in favor of the proprietary model.
So what happens now?
Standardized Medical Records
Though the executive order pushes federal agencies to use standards-based IT, it won't drive many reluctant doctors to adopt such software.
By Marianne Kolbasuk McGee
InformationWeek
Aug 22, 2006 06:00 PM
In the latest U.S. government move aimed at pushing the nationwide adoption of health IT, President Bush Tuesday signed an executive order requiring federal agencies that sponsor or administer health programs, including the U.S. Department of Health and Human Services, to adopt and use interoperable, standards-based health IT systems such as electronic medical records.
In addition to HHS, other large federal agencies that sponsor or administer health programs include the Department of Defense, Veterans Affairs, and the Office of Personnel Management.
The order, which becomes effective Jan. 1, also requires that parties that contract with those federal agencies—such as private health plans that participate in Medicare programs—also adopt standards-based, interoperable systems as they upgrade or implement health IT.
Bush two years ago set out the goal for most Americans to have electronic health records by 2014. Yet progress has been slow, and these orders won't necessarily affect the private-practice doctors who have been most reluctant to adopt electronic-medical records. Some federal agencies, such as the VA, have in fact been aggressive adopters of health IT.
"These orders won't have much effect at all," predicts Stephen Davidson, professor of health-care management and management policy of Boston University's School of Management.
Specifically, the order says that "as each agency implements, acquires, or upgrades health information technology systems used for the direct exchange of health information between agencies and with non-Federal entities, it shall utilize, where available, health information technology systems and products that meet recognized interoperability standards."
In addition to those requirements, the executive order also requires that the agencies provide enrollees and beneficiaries of federal health-care programs with "transparency" regarding costs and quality of health-care services. Specifically, the public should have access to cost and quality information regarding the common health services the agencies pay for.
The last health-IT related executive order signed by Bush was in April 2004, when he ordered the creation of a federal health IT czar—the position of national health information technology coordinator. That position was held for two years by Dr. David Brailer, who resigned in the spring. A replacement hasn't been named yet.
The new executive order adds another level of formality in the push for IT adoption to reduce costs and improve quality of care. In addition to the various moves by the White House, a number of health-IT related bills have been introduced in the Senate and House of Representatives over the last year or so, including a health IT bill approved last month by the House.
Many doctors are unconvinced that they should buy costly systems that require many process changes, when the financial benefits from processing efficiency and even improved care don't flow to them.
"The missing link is that not many doctors have systems that talk to each other," says Davidson of Boston University, "and even if the software works in the way it's supposed to work, the benefits of those systems go to the payers and the patients."
Mass. to use Microsoft Office in ODF plan
As expected, Louis Gutierrez, chief information officer of Massachusetts' Information Technology Division, on Wednesday sent a letter seen by CNET News.com to advocates of people with disabilities. The letter was in response to their concerns about the commonwealth's plan to move to the OpenDocument format, or ODF, standard.
In addition, Gutierrez last week wrote to the state's Information Technology Advisory Board with an update on the OpenDocument format implementation plan, as had been planned.
Last year, Massachusetts caught international attention for its decision to standardize by January 2007 on ODF, a document format standard not supported in Microsoft Office.
Disability unfriendly?
The move was criticized by disability-rights groups, which complained that going to ODF-compliant products, such as the open-source OpenOffice suite, would not adequately address their needs. In general, Microsoft Office has better assistive technologies, such as screen enlargers.
Earlier this year, Massachusetts' IT division said it would adjust the dates of the OpenDocument adoption if the state could not find adequate accessibility products.
In his letter to disability-rights groups, Gutierrez said emerging Microsoft Office plug-ins will enable Massachusetts to stick to its standardization policy while meeting accessibility needs. Plug-ins act as converters, enabling people to open and save documents in the OpenDocument format from Microsoft Office.
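For readers unfamiliar with the format itself, here is a minimal sketch that writes an OpenDocument text file programmatically using the third-party Python odfpy library. It is not one of the plug-ins Massachusetts evaluated and is not mentioned in the article; it only shows what an ODF document is at the file level (zipped XML that any compliant suite or converter can read).

```python
# Minimal sketch (assumes the third-party odfpy package: pip install odfpy).
# ODF files are zipped XML; odfpy builds that XML and writes the .odt for us.
from odf.opendocument import OpenDocumentText
from odf.text import H, P

doc = OpenDocumentText()
doc.text.addElement(H(outlinelevel=1, text="Accessibility memo"))
doc.text.addElement(P(text="This paragraph is stored as ODF XML inside a zip container."))
doc.save("memo.odt")
print("Wrote memo.odt (readable by OpenOffice, StarOffice, or Office via a plug-in)")
```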
"This approach to ODF implementation will fulfill our legal and moral obligations to the community of people with disabilities, acknowledges the practical requirements of implementation and enables the Executive Department to continue to pursue the benefits of using open standards for information technology," Gutierrez wrote.
The state had considered adopting other office suites, such as OpenOffice and StarOffice, but Gutierrez decided against those because they would not support accessibility requirements by the January 2007 target date.
State executive branch agencies will take a phased approach to using a plug-in. Gutierrez did not indicate which plug-in the state intends to use, but said he expects the plug-ins to be fully functional by 2007.
"Early adopter" agencies, including the Massachusetts Office on Disability, will use a selected plug-in starting in December of this year. The IT division will then move all executive branch agencies in phases to the OpenDocument standard by June of next year.
Gutierrez added that the state will consider OpenDocument format-compliant Microsoft Office alternatives as they become more mature.
Not "anti"-vendor
In his letter to the state's Information Technology Advisory Board, Gutierrez referred to the economic and political factors that have weighed on the state's planned move to ODF.
His predecessor, former CIO Peter Quinn, and other ITD officials were faulted by a state senate oversight committee for, among other things, not providing an adequate cost-benefit analysis. Meanwhile, Microsoft executives argued that the OpenDocument format favored the open-source business model over Microsoft's closed-source model.
Gutierrez told Massachusetts officials that keeping Microsoft Office on state desktops enables the state to "thread the needle" by adhering to a document standard created and supported by multiple software providers without being opposed to, "anti," any one vendor.
Because Microsoft Office and the forthcoming Office 2007 do not support OpenDocument natively, many expected the state to move to a different productivity suite.
Keeping Office, however, makes the ODF implementation more economical and less disruptive to end users, Gutierrez wrote to state officials. Microsoft started its own OpenDocument format plug-in effort earlier this year by sponsoring an open-source project.
"Technology that did not exist at the time of the policy formulation--namely various plug-in or translator components that can be added to Microsoft Office to allow it to read/write to OpenDocument format (ODF)--is at the heart of our near-term approach," Gutierrez said.
femtocells - small cellular base stations
Thursday, 24 August 2006
When McDonald's started installing WiFi hotspots in its restaurants, it provoked concerns about exposing its customers to radiation. That could be nothing compared to what will happen when the much-maligned cellular base stations start appearing as 'femtocells' in residential and corporate environments.
According to ABI Research, in the near future, femtocells - small cellular base stations designed for use in residential or corporate environments - will be adopted by mobile operators with great enthusiasm. It forecasts that, by 2011, there will be 102 million users of femtocell products on 32 million access points worldwide.
The research company says operators will embrace femtocells because of "greater network efficiency, reduced churn, better in-building wireless coverage and the abilities to shape subscriber data usage patterns and to build platforms upon which fixed-mobile convergence services can be realised."
"Femtocells offer many benefits to operators," according to principal analyst Stuart Carlaw. "From a technological standpoint, their better in-building coverage for technologies such as WCDMA and HSDPA is an incredibly important aspect of service delivery. From a strategic and financial standpoint, the routing of traffic through the IP network significantly enhances network quality and capacity, and reduces the opex that carriers expend on backhaul."
The most interesting characteristic of femtocells, according to Carlaw, is that they can form the basis of a viable option for realising converged fixed-mobile services. "They give operators a cost-effective way to support fixed-mobile substitution, as well as a platform in the home upon which additional features such as Wi-Fi and IPTV can be layered."
However, Carlaw adds a note of caution: "This is a very nascent market and as such there is a pressing need for some standardisation, or at least a common recognition of what a femtocell's minimum requirements should be."
The development of femtocells could also open up a new front in what many see as a looming battle between operators of WiMAX and cellular technologies.
In July, UK chip maker picoChip and Korea Telecom formed a partnership to develop home base stations, otherwise known as 'femtocells', conforming to the mobile WiMAX standard IEEE 802.16e-2005 and its Korean variant, WiBro, which KT is now using to launch a commercial network in Korea.
According to picoChip, "By enabling cost-effective deployment [femtocells] allow carriers to compete with UMA or voice-over-WiFi."
picoChip noted that the femtocell concept was being recognised in the 3G community, but said that, as the carrier most advanced in deploying WiBro, KT was the first to develop such a program for WiMAX. "WiBro, the South Korean wireless telecommunications standard, is compatible with WiMAX 802.16e-2005 (mobile WiMAX), and aims to be the technology that delivers personal broadband to consumers around the globe."