Microsoft has finally conceded support for the OpenDocument Format in its Office products. This Thursday, the company will release a sponsored piece of software on SourceForge.net called the Open XML Translator, which will convert documents between Microsoft's own Open XML format and the OASIS-standardized OpenDocument Format (ODF).
The plug-in will allow users to open and save documents in the ODF format, and as an added bonus it will work with older versions of Microsoft Office, not just the new 2007 version. The plug-in is being developed by a French partner company called Clever Age; Microsoft's goal is to have it available by the end of this year and working with Excel and PowerPoint at some point in 2007. What made Microsoft decide that ODF support was important? Microsoft's general manager of interoperability and standards, Tom Robertson, has the answer.
"We're hearing that (customers) don't want homogeneity–they want diversity; they want translatability. And some customers are saying they would like us to focus on this to a certain extent, to make sure the product is high quality."
Although everything sounds peachy, one major catch is that some Office features won't be available when users choose to save a document with the ODF setting. This is mainly because of the difference in formats between ODF and Open XML. Nevertheless, this development is a major step for Microsoft in the world of Free and Open Source Software, especially since the company has chosen to use SourceForge rather than its own CodePlex site to host the plug-in.
Once the Open XML Translator plug-in is finalized at the end of the year, it will be interesting to see how it stacks up next to the ODF Plugin developed by the OpenDocument Foundation Inc. In any case, Microsoft has finally realized that ODF is growing in popularity and that the safest bet is to support both Open XML and ODF. It's all about choice, right? My question to you: do you like having two different formats, or would you rather see Microsoft work closely with the OpenDocument Foundation to come up with one ruling format for the future?
Our understanding of vertebrate genetics and development has been helped immensely by the ability to target genes for deletion in the mouse, a technique known as gene knockout. Once one copy of a gene is eliminated, animals can be bred that lack the gene entirely, allowing the gene's role in development, behavior, and health to be assessed in an organism that's closely related to humans. Many of the genes identified as being important in other organisms or via biochemistry have since been knocked out. But this technique has its limits, not the least of which is that there's a tendency to knock out only the genes that we expect will be interesting, and labs wind up racing to be the first to knock out the most interesting genes.
Last week's edition of Science takes a look at the state of the art in mouse knockouts. One article looks at some of the decisions involved in the NIH's program to create a publicly accessible collection of knockouts in every single gene of the mouse genome. The state of Texas had funded a combined public/private consortium that hoped to get a big slice of the NIH work, but it seems to have missed out on the money. An official explanation has not yet been released, but speculation suggests that concerns exist about both the technology used by the Texas group and the accessibility of the mice it produces. Meanwhile, in China, Fudan University hopes to get in on the NIH action via a partnership with Yale University. Instead of targeted knockouts, these researchers are getting transposons (mobile genetic elements) to hop around the mouse genome and searching for cases where they have hopped into genes. Although not as directed as targeted knockouts, the ease of generating transposon hops allows large numbers to be screened, potentially making up in volume what the approach lacks in precision.
The final piece looks at why the NIH effort and a parallel project in Europe seem to be necessary: rampant inefficiency. It quotes a researcher who has attempted to obtain a number of mice that have been generated by others:
"Once I requested a mouse, and the guy wanted everyone from himself to his grandmother to be a co-author on everything we published with that mouse," says Aguzzi. "It was like scientific prostitution." Another time, he says, a researcher promised him a mouse but took more than a year to deliver: "[The investigator] should have just said his cat ate it; it would have saved us a lot of trouble."
But some mutations are far too easy to obtain. Of the over 11,000 genes knocked out, over 700 had been targeted at least three times by separate research groups, and one had been hit 11 times. Over a quarter of the mice aren't in publicly accessible collections.
Even assuming better public access can be achieved, a number of significant problems will remain. Several companies have set up mouse knockout services, and have patented the specific deletions they offer. Aside from IP issues, there are biological complications. Animals with a developmental phenotype that survives to birth will be easy to identify; behavioral defects or embryonic death can be much more challenging to characterize, and it's unclear how well that will work out on the scales of these projects. There's also the question of technique. Knockouts can be complete deletions, can have a marker inserted in place of the gene, and/or can be set up to be deleted in specific tissues or have the deletion induced by a specific drug treatment. Different programs are focusing on different techniques. Meanwhile, an increasing number of researchers are focusing on the classical genetic approach of randomly mutating mice and looking for interesting phenotypes before going back and identifying the genes (an effort I've been involved in). Each technique has its benefits, so I expect that integrating all of these mouse sources and generating a consensus on how to push forward in an organized way will be extremely challenging.
A while back I wrote an article about a specific form of hydrogen that is known to exist in interstellar gas clouds. Now I am pleased to present some new results along the same lines, which suggest that we know even less than we thought we did. These clouds form a reaction chamber quite unlike anything we have on Earth. The temperature is very low, so the energy to start a reaction cannot be obtained by simply banging the atoms together. Instead, the driving force behind interstellar chemistry comes from the ultraviolet radiation emitted by surrounding stars. However, this produces an unusual set of conditions for a reaction to proceed. For instance, an ultraviolet photon may ionize a species by stripping off an electron. If the ionized species then collides with another atom that is also in a pre-prepared state, a reaction may occur. However, it can just as easily collide with an electron, in which case it returns to a neutral state. Thus, many strange intermediate species of hydrogen molecules are possible.
One of these much-discussed species is H2-, which is essentially two protons and three electrons. This species is important because its production and decay rates are expected to dominate the equilibrium state of such clouds. Despite this, numerous attempts to accurately detect the species have failed. Now a laboratory experiment has both confirmed that such an anion does exist and measured how long it lasts before falling apart. The answer surprised everyone: H2- survives for about 8 microseconds. The theoretical predictions for such a beast are pretty simple; like a child's block tower, it should fall apart as soon as it can, giving it a lifetime of just half a microsecond.
Interstellar chemistry is throwing up a few surprising results, which show we don't understand the underlying fundamental physics and chemistry as well as we should.
Having the public use your company name as a verb is, in one way, a marketer’s Shangri-la. Many companies would kill for the name recognition and popularity of “xerox,” “google,” and “hoover.” For the companies themselves, though, being “verbed” has its dark side. A company that does not defend its trademark risks losing it when it becomes a common figure of speech, which explains why Google is not happy about its recent inclusion in a new Merriam-Webster dictionary.
Merriam-Webster’s Collegiate Dictionary now includes the word “google,” which it defines as a verb (prediction: within a decade, it will also be an expletive, as in “go google yourself!”). Sensitive to the search engine’s concerns (and to its legal team), the dictionary does note that the word is trademarked. Dictionaries, news outlets, and websites that do not note this risk receiving a strongly worded letter asking them to either remove the offending use of the term or note that it is trademarked.
Right now, “to google” actually means to use the Google search engine. Google no doubt fears that the world will go the way of “hoover,” which has become a generic term for “vacuum” in the UK. If “google” ever comes to mean “I searched for it on the Internet,” the company’s careful branding and promotion will be diluted and the name will lose value.
“Google” joins other up-and-comers in the new dictionary, sharing page space with words like “himbo” (male bimbo) and “mouse potato” (think couch potato, but with a computer). Although it has just made it into the dictionary, “google” has been on linguists’ radar screens for years. In 2002, for instance, it was judged the “most useful” new word of the year by the American Dialect Society.
Before getting our first TiVo a couple of years ago, we were faced with a choice: either the world’s best-known DVR or a media center PC. A home theater PC was attractive because of its versatility, but the thought of having a PC in the living room proved to be a turnoff to my better half, so the TiVo won out. Despite the inroads that HTPCs have made into the living rooms of the world, more traditional DVRs and cable set-top boxes are still far more popular.
That may change over the next few years, according to a study by ABI Research. The research firm is predicting heavy growth ahead for media center PCs and associated hardware, up to US$44.8 billion by 2011 from its current US$3.7 billion level. "With the arrival of faster in-home digital networking technologies such as MoCA, an industry-accepted framework for networked digital media distribution in DLNA, and the increase in both pay-TV and Internet content moving over in-home networks, the home media server is becoming a key beachhead in the digital home," according to principal analyst Michael Wolf.
The research firm calls the amalgamation of media center PCs, PVRs, gaming consoles, network attached storage (NAS) devices, and set-top boxes "digital media servers." As consumers acquire more and more content in a purely digital form (i.e., without physical media), the use of digital media servers to store and move the content around the house or onto portable devices will take on greater importance.
It’s a familiar problem for those of us who have embraced digital content distribution. Here’s an example: say that in a couple of years, you are able to download a high-quality, full-length Hollywood blockbuster from Netflix to your PC. Do you want to sit at the computer desk and watch the movie on your 23" LCD, or would you rather stream it into the family room so you can take it in on your 42" plasma display and surround-sound system? If services like full-movie downloads are truly going to take off, producers and distributors will have to overcome the problem of in-home storage and distribution with a mass-market-friendly solution (assuming they ever get that pesky DRM thing solved).
There’s one cloud on ABI Research’s rosy horizon: CableCARD. Wolf believes that Vista’s support for CableCARD will help move HTPCs further into the mainstream. However, CableCARD’s marriage to the PC may never be fully consummated. First, do-it-yourselfers will be left out of the HTPC/CableCARD fun. For the foreseeable future, only OEMs like HP and Dell will be certified for use with CableCARD—and those systems will be using CableCARD 1.0. That means no support for such niceties as video on demand, interactive services, or multistreaming—and it won’t be compatible with version 2.0. In fact, there is no guarantee that Vista will ever support CableCARD 2.0, which could limit the potential for widespread adoption of media center PCs.
The potential for growth in the media center PC, NAS, and other home digital media storage and distribution markets is there, but if the stars don’t align, it may never come to pass. The broadcast flag, while popular with content producers, could have a chilling effect on innovation in the consumer electronics market. If legislators can resist media industry pressure to enact the broadcast flag and the market is allowed to grow unencumbered by government regulation, the future for HTPCs and home media networks may be as bright as some believe.
Among the many improvements offered by 10.4.7, one new feature isn't likely to be eagerly welcomed by users. One sharp-eyed blogger noticed that 10.4.7 has been phoning home to Apple, as often as twice within a seven-hour period. What is 10.4.7 so busy reporting? The answer is as confusing as it is seemingly innocuous:
You can now verify whether or not a Dashboard widget you downloaded is the same version as a widget featured on (www.apple.com) before installing it.
It seems that Apple added a process called "dashboardadvisoryd" that phones home for two widget-related URLs:
The first appears to be a public key or something. The second appears to be empty but its header values may convey something of interest to Apple’s client.
While the dashboard advisory process doesn't seem to be sending much personal information, it also can't be easily turned off, which sets a bad precedent.
Privacy is a touchy subject right now. In the United States, phone calls are being monitored, bank accounts are being monitored, and the recent firestorm over Windows Genuine Advantage shows that people are pretty fed up with it all. Is this really the best time for OS X to start contacting the mothership? If so, at least offer some features that are a reasonable tradeoff for the loss of privacy and let people opt out, please! Really, how often have you lain in bed at night fretting that perhaps you downloaded a widget that was a different version from the one on Apple's site? If that's all this process is doing, it feels rather like killing a gnat with a bazooka (specifically, a bazooka that shoots bad PR in every direction).
Perhaps some of our readers have ideas on why Apple would solve this particular problem with a fairly controversial solution?
Intel Capital has been busy. We reported yesterday on their multimillion dollar investment in videogame advertising, but that amount is pocket change compared to the US$600 million Intel has just invested in Clearwire. The investment is the largest ever for Intel Capital, and it signals the company’s continued interest in the WiMAX-style technology pushed by Clearwire.
Who is Clearwire? The firm, headquartered up in Washington state, offers wireless Internet access based on the IEEE 802.16e-2005 standard, with plans to adopt full WiMAX compatibility when the technology finally gets up and running. They already operate networks along the West coast and in Texas, Florida, North Carolina, Denmark, and Mexico. Connection speeds top out at 1.5Mbps and the company says that monthly charges range from US$30 to US$37, with an additional five-spot a month for the modem rental.
Clearwire plans to use the massive cash infusion to help fund a nationwide build-out of its technology, hopefully getting a jump on more established telecommunications companies which have not yet fully committed to WiMAX deployments. Intel wants to push the technology any way it can, since it hopes to be a major manufacturer of WiMAX chipsets. The first WiMAX laptop cards should be ready by the end of this year, and Intel has already announced plans to incorporate the new technology into its Centrino platform.
While Clearwire races to expand its reach, it faces competition from the established players in the industry. Rupert Murdoch of News Corp. has perhaps been most vocal about his interest in building a national WiMAX network in the US, a move that could suddenly secure the company a major role in Internet distribution (though such a move would have its own difficulties).
Given all the interest that now surrounds WiMAX, it looks like the long-delayed technology will at last get its day in the sun. If it turns out to offer reliable broadband speeds without also requiring a telephone line or cable TV connection, expect this to be a deeply disruptive technology over the next decade.
We all remember collecting giant stacks of “free” AOL CDs (and for the really old techies among us, free AOL floppy disks, which could at least be reformatted and used for something else). Now it turns out that AOL may be offering something else for free: access to their Internet service.
Those still on dial-up (yes, they are out there) will continue to pay a fee to use the AOL service. However, subscribers to the “AOL for Broadband” service, where you bring your own high-speed Internet connection to the party, will no longer have to pay subscription fees. The plan, which could see AOL losing up to US$2 billion in revenue from subscription fees, is intended to boost AOL’s numbers and ultimately the company’s advertising revenue. AOL hopes that up to 8 million of its existing dial-up customers will take advantage of the offer. An earlier attempt to move people from dial-up to broadband by increasing dial-up access prices met with limited success.
The proposed change comes at a crucial point for AOL. While the company still has an impressive number of subscribers, it is losing customers rapidly as broadband access becomes more widespread. In 2002, AOL had 26.5 million subscribers in the United States; by 2006 this figure was down to 18.6 million. AOL estimates that the company lost approximately 850,000 members in the first quarter of this year.
The plan to drop access charges from broadband essentially turns AOL from a subscription-based service to an advertising-supported one. While companies like Google and Yahoo have shown that you can make very good money from online advertising bundled with free content, it remains to be seen whether the corporate culture at AOL will adapt well to this change.
According to the Wall Street Journal, the new plan was proposed to top Time Warner executives by AOL CEO Jonathan Miller, in a meeting held last week in New York. Time Warner, which was “acquired” by AOL in a stunning stock swap deal in 2001, right before the dotcom crash, has been distancing itself from its AOL portion. AOL reported a colossal US$99 billion loss in 2002, and in response, Time Warner removed “AOL” from its name and removed Steve Case from his position as executive chairman. Case left the Time Warner board in 2005. The media giant has considered selling its AOL property many times before, but decided to partner with Google instead. It is likely that the partnership with the search engine company provided the impetus to move to an advertising-supported business model.
Electronics have long been recognized as a weak link when it comes to secure conversation. From bugs hidden in lampshades to phone taps to keystroke tracking software, electronics provide the easy path to monitoring and censoring communications. In no area is that so apparent, perhaps, as in text messaging, as some users around the globe are discovering the hard way.
The first level of text-messaging censorship begins at the phone itself. While it’s certainly possible to enter any word using the multi-tap alphabetic method in which a=2, b=2-2, c=2-2-2, d=3 and so on, it isn’t very convenient. This has led manufacturers to develop alternate systems like T9, which make it easier to enter common words. T9 works by using algorithms to determine what word a user is trying to enter. Punching 2-2-8 might default to "cat," for example, since that’s a common word which uses the letters associated with those numbers. It might also give you "bat," however, which is another logical guess based on the letters available through those keystrokes. Usually, a provision is made for selecting words other than the algorithm’s first guess.
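The dictionary-lookup idea behind T9 can be sketched in a few lines. This is an illustrative toy, not the actual T9 algorithm (which also ranks candidates by frequency); the tiny word list here is invented for the example:

```python
# Toy sketch of T9-style predictive text: translate each letter to its
# keypad digit, then find which dictionary words match a keyed sequence.
KEYPAD = {
    '2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
    '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz',
}
LETTER_TO_DIGIT = {ch: d for d, letters in KEYPAD.items() for ch in letters}

WORDS = ["cat", "bat", "act", "dual", "cup"]  # stand-in dictionary

def digits_for(word):
    """The keypad digit sequence used to type a word."""
    return ''.join(LETTER_TO_DIGIT[ch] for ch in word.lower())

def candidates(keyed):
    """All dictionary words whose letters map to the keyed digits."""
    return [w for w in WORDS if digits_for(w) == keyed]

print(candidates("228"))   # "cat", "bat", and "act" all collide on 2-2-8
print(candidates("3825"))  # "dual"
```

The collision on 2-2-8 is exactly why phones need a way to cycle through alternates, and filtering a word such as the naughty reading of 3-8-2-5 is as simple as leaving it out of the dictionary.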
Where things start to get hairy is when a user enters something like 3-8-2-5, which can spell either "dual" or a somewhat naughty word which you won’t find in your family newspaper. (Raise your hand if you aren’t looking at a phone right now. I thought so.) In that case, the manufacturer could design the phone to provide the second word as an alternate, or more likely, avoid it altogether. In a nation like the US, avoiding a word which some might find objectionable is a business decision that probably prevents some complaint letters. In other countries, it could be a government mandate, and the banned word might not be 3-8-2-5, but something like "liberty" or "Taiwan."
At first, that sounds inconvenient, yet relatively benign. After all, a user could still switch to alphabetic entry and write anything they want, right? Perhaps, but the second level of control involves monitoring and censoring the messages of users, as the Chinese government has been doing since the SARS outbreak of 2003. At that time, word about the SARS epidemic spread like wildfire despite very little coverage by the government-controlled press. Since realizing the informative power of mobile phones, Chinese authorities have monitored and filtered text messages as a matter of course.
The problem doesn’t begin or end with China. Security agencies in countries as diverse as Iran and Germany have been spotted responding to text messages regarding political leaders or outlawed ideologies. Much of this communication scanning is done with the compliance of the mobile phone providers, which simply consider it the price of doing business in various countries. We’ve seen this before, as in the case of Google and other portals filtering search results to suit local authorities.
The good news is that censoring communication continues to remain something of an arms race. While aficionados of such evil words as "Taiwan" or 3-8-2-5 might sometimes find themselves under scrutiny by the Powers That Be, there’s nothing to stop them from switching to slang which means the same thing. Taiwan might be referred to as "the neighbors," for instance, while 3-8-2-5 could be "frak."
The phone companies’ plans to roll out video services across the nation have been slowed by the need to negotiate individual contracts with most municipalities. AT&T, for instance, hit a wall in talks with the Chicago suburb of Roselle, owing to the fact that AT&T neglected to mention that the new lines it wanted to lay would carry video. (AT&T has insisted that its new IPTV service is wholly different from cable television: a data service, not a video network.) Now Verizon has run into trouble of its own as it attempts to bring its fiber-optic FiOS service to Montgomery County, Maryland.
The county, located just outside of Washington, DC, has apparently demanded that Verizon reimburse county lawyers and consultants for their time spent on negotiations. This, among other issues, did not sit well with Verizon, which decided to drag the county to court. Verizon plans to argue that the local government’s actions violate federal law. For its part, the county argues that Verizon simply won’t play by long-established rules that govern the cable industry. Chief Administrative Officer Bruce Romer said that the lawsuit “is evidence that Verizon is unwilling to play by the same rules that apply to their cable competitors.”
Verizon and AT&T are newcomers to the world of municipal video franchising, and both appear frustrated by the tedious nature of the negotiations. Imagine having to secure deals with every municipality or county across the nation in which you want to introduce a new video service. Not a pretty picture, is it? On the other hand, municipalities don’t want to sign away valuable access to rights-of-way without a little something in return (Montgomery County, for instance, wanted Verizon to set aside 65 channels for local public access programming).
The telecommunications companies aren’t opposed to sharing the money, exactly, but they don’t want different regulations and contracts in every county. AT&T’s strategy has been to offer municipalities a “memorandum of understanding” that guarantees them five percent of local revenues with no room for negotiation (Roselle did not like this approach). The companies would far prefer to negotiate at the state or federal level and secure uniform franchising agreements.
That wish may be coming true, as recent legislation looks ready to take the matter out of the hands of local authorities. While this approach will mean that cities have fewer options for regulating and negotiating with the telcos, it does mean that the companies can roll out their technology much faster and at lower cost. Whether this ultimately benefits consumers remains to be seen, but it’s not a bad idea in theory.
When IPTV and other new services are at last deployed, the challenge will be convincing people to buy them. While cable television is a well-understood technology, IPTV is not. AT&T has taken the unusual step of previewing its service for small groups of consumers in San Antonio, where its U-verse IPTV system first launched. This approach, dubbed “Selling TV like Tupperware” by the Wall Street Journal, tries to convince customers to switch by inviting them into an upscale house and showing them side-by-side televisions, one with IPTV, the other with cable. The company is also seeking out neighborhood leaders who will host similar “TV parties” in their own homes.