Blog
Jan 01 , 2019 / By :

Universal Music Group, a division of Paris-based Vivendi Universal, has just announced plans to revamp the way they sell music CDs in European markets. Rather than serving up every product in the same 20-year-old CD case design, Universal will divide their offerings into three tiers with different packaging and different prices.

At the bottom end, a “basic” package will be introduced, consisting of a simple sleeve made of heavy cardstock. It should sell for €9.99, the same price Apple charges for a full album download from the European iTunes Music Store. The package sounds reminiscent of Neil Young’s Mirror Ball, among other cardboard packages offered up over the years, but it’s not a one-off specialty item. Universal hopes to ship some 100 million of these albums by year’s end, starting from the planned September introduction. The basic package is earmarked for older releases and other slow sellers in the traditional jewel case package, and will come with no liner notes or other extras. According to the Guardian, the no-frills concept is an “acknowledgement” of the power of iTunes.

If extra features are what you want, you’ll be more interested in the premium “DeLuxe” packaging at MSRPs around €19.99. That version, aimed at collectors and gift-givers, will feature a beefed-up case and tons of bonus features, ranging from expanded notes and bonus tracks to DVDs filled with behind-the-scenes clips and live performances.

In between these extremes comes the “Super Jewel Box,” a sturdier version of today’s brittle jewel cases, but otherwise much the same as anything you’d find at your local Wal-Mart today. It’s supposed to sell for €14.99, comparable to today’s average CD price, and will feature new releases and popular back-catalog items.

Universal claims to understand that digital downloads are the future, but then goes on to defend the need to boost physical CD sales again. The company’s executive vice-president of international marketing and A&R, a fellow with the notable name of Max Hole, says that downloading has renewed interest in older titles, and that all you need to do in order to sell those albums is lower the prices. “We can grow the CD market,” says Mr. Hole. “That might be a little optimistic, but we can certainly slow its decline.”

I can see some value in the new packaging options: premium bundles will always have a following, and my CD rack is littered with broken cases, so the Super Jewel thing sounds like a welcome upgrade. But while I enjoy Mirror Ball immensely and can appreciate the Earth-friendliness of paper packaging, I’m not convinced that pricing is the only problem to solve regarding slow back-catalog sales. Those albums just aren’t promoted, and there’s also a major convenience difference between downloads and plastic discs. Besides, if all you’re getting is a plain cardboard sleeve with cover art, and you just want to rip it to your iPod anyway, what’s your motivation for getting physical? There’s more research to do regarding the true drivers of the digital revolution, and whether it’s more about price, about convenience, or something else entirely.

Jan 01 , 2019 / By :

I was surprised to find that an article on the dry topic of crop yields could be so compelling. It seems to have it all: new information pulled together from a collection of studies, suggesting that the current consensus was based on an outdated technique. The topic is politically charged, and the editors even let the authors get away with a pun in the title ("Food for Thought"). How does this all fit together?

As atmospheric carbon levels go up and the planet warms up, crops are expected to be impacted in a variety of ways. Increased temperature and CO2 are expected to accelerate growth, but soil moisture will decrease, potentially counteracting these effects. Based on a number of studies, it was expected that these factors would largely balance out, with a slight decrease in crop yield possible. The new study addresses one of these factors: how CO2 affects growth and yield of crops. The authors note that the estimates of these effects being used in assessing the impact of rising carbon levels are based on experiments in enclosed buildings, where it's easy to control the atmosphere. But technology has since improved, and free-air concentration enrichment (FACE) technology allows crops to be grown in the open under controlled atmospheric conditions, more closely approximating real-world conditions.

What happens when you compare the results of FACE experiments with enclosed results? The enclosed experiments produce crop yields that are more than two-fold higher than those produced using FACE technology. This, in turn, suggests that expectations for future crop yields may be overestimates. As the authors note, "This casts serious doubt on projections that rising CO2 will fully offset losses due to climate change." They do, however, wrap up on an optimistic note: crop plants have been selectively bred for a number of properties, and increased growth at higher CO2 levels may be as accessible to breeding as anything else. The same technology that allowed us to recognize the problem may be useful in breeding a correction for it.

Jan 01 , 2019 / By :

Some Apple rumors, like the iTablet, never die, no matter how much time passes or how little sense they make. At the other end of the rumor spectrum are ideas so obvious, like the flash-based iPod Nano, that it is only a matter of time before they become products. There are also rumors that no one believes right up until they happen, like the switch to Intel, and there are the rumors that make sense but never seem to happen.

An Apple spreadsheet would fall into the last category.

Was it only a year ago that AppleInsider was touting "Numbers" as the next big software release from Apple Computer? Yes, it was.

Rumors that Apple Computer has been quietly developing its own spreadsheet solution gained a dab of credibility this week as sources pointed to a revealing company filing with the United States Patent and Trademark Office. Just two days after requesting a trademark on the word 'Mactel,' which seemingly describes the convergence of Macintosh design with Intel hardware, Apple on June 8th filed for a standard character mark on the word 'Numbers.'

It's a little over a year later and a new spreadsheet rumor is out, except this time ThinkSecret is the messenger, the name is "Charts," and it's *confirmed* to be true.

Long rumored—or at least, assumed—to be in development, sources say Apple is not planning on positioning Charts as a competitor to Microsoft's Excel, but rather as a more consumer-friendly spreadsheet application that can handle the needs of home users and small businesses but not pretend to execute any of the more advanced functions of Excel.

Presumably, Charts will allow the import of Excel spreadsheets—unless they have advanced functions it cannot pretend to execute. Beyond that, it's anybody's guess. Along with the obvious, ThinkSecret claims development includes nonsensical stuff like Address Book integration. Pricing is unknown; the iWork suite currently sells for US$79.

So, what does Charts—if true—mean for Mac users? Not much.

To date, iWork has made no impression as an office suite. Of course, Apple stresses that it is not competing with Office for the Mac, but anyone who has used Pages could have told you that. The only way this could matter is if Apple did what should have been done three years ago when iWork first came out. Apple needs to make iWork free on new Macs, a true replacement for AppleWorks. That would be a rumor worth seeing come true.

Jan 01 , 2019 / By :

At this point, if you were interested in trying out Windows Vista Beta 2, you have almost certainly done so already. Though some of us don't realize it right away, Microsoft has two goals in mind when it releases a public beta: give the tech community a preview of the new operating system and, more importantly, gather massive amounts of bug reports and feedback about the new OS.

Focusing on the second goal, customer feedback, one way Microsoft allows testers to report bugs is through the Microsoft Connect website. Besides reporting, testers can also view, validate, and search for issues. The other night, Robert McLaws of Longhorn Blogs was viewing and commenting on some bugs when he decided that he'd like to see a statistical analysis of the roughly 28,700 Vista bugs in the system. Over four and a half hours, McLaws manually added every bug posted on Connect to an Excel 2007 spreadsheet, and he uncovered some striking trends.

Once McLaws removed 1,072 duplicate items, he started extracting statistics from the raw data (a rough sketch of that sort of tallying follows the list below). Some of his intriguing findings include:

- An average of 81 bugs per day are reported for Vista
- The count of bugs per day is increasing, not decreasing
- Around 200 bugs are reported within the first 24 hours of a new release
- Over 20,000 bugs have been closed so far, where closed holds a status of "Closed" or "Resolved"
- When Microsoft released the Start Orb, 353 bugs were added to Connect during the first day and 338 during the second
- Only 1/5 of the total bugs submitted are still open
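For anyone curious what this kind of number-crunching looks like when it isn't done by hand, here is a rough Python sketch of the same sort of tallying. It is purely illustrative: the CSV file name and column names ("title", "status", "opened") are assumptions for the example, not details of McLaws' spreadsheet or of the Connect export format.

```python
# Rough, illustrative sketch of the kind of tallying McLaws did by hand.
# The file name and column names ("title", "status", "opened") are assumed
# for the example; they are not the real Connect export format.
import csv
from collections import defaultdict

with open("vista_bugs.csv") as f:
    rows = list(csv.DictReader(f))

# Drop duplicate reports, here naively keyed on a normalized title.
seen, unique = set(), []
for row in rows:
    key = row["title"].strip().lower()
    if key not in seen:
        seen.add(key)
        unique.append(row)

# Count bugs reported per day and compute the average daily rate.
per_day = defaultdict(int)
for row in unique:
    per_day[row["opened"][:10]] += 1   # keep the "YYYY-MM-DD" date prefix
average = sum(per_day.values()) / float(len(per_day))

# Treat "Closed" and "Resolved" as closed; everything else is still open.
still_open = [r for r in unique if r["status"] not in ("Closed", "Resolved")]

print("%d unique bugs, %.0f reported per day, %.0f%% still open"
      % (len(unique), average, 100.0 * len(still_open) / len(unique)))
```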

One item above that is worth discussing is the increase in the number of bugs per day. While one could conclude that an increase in bugs per day is a Bad Thing, McLaws notes that more and more people are using the system now than ever before, especially with the public beta and all. As a software engineer myself, I agree with McLaws; the more hands on a system, the more problems will be uncovered. Once the company releases RC1 to the public, we can expect another drastic spike in the bug count, although the items should be more in-depth and less obvious than the bugs reported during Beta 2.

Overall, Microsoft still has a ways to go before it has an operating system that can be sold and proudly delivered, but we knew that even without pretty graphics and some statistical breakdowns. Just like every piece of software that has ever shipped, Vista will ship with some bugs intact. Let's just hope Microsoft knocks out the big boys and doesn't leave them for a service pack instead.

Jan 01 , 2019 / By :

Originally designed for the GNU Image Manipulation Program, the GTK graphical application development toolkit provides an extensive assortment of widgets and controls for cross-platform software construction. The latest version, GTK 2.10.0, has been officially released. With plenty of exciting new features for users and developers, GTK 2.10.0 is a significant improvement over previous versions. The popular toolkit is widely used on a variety of platforms and provides the foundation for the GNOME desktop environment. Available under the GNU LGPL, a comparatively permissive license that allows proprietary applications to link against the library, GTK has been adopted by numerous proprietary and open source software developers.

After receiving a steady litany of complaints about the absence of a visible file textbox in the GTK file chooser dialog, the GTK developers have finally relented and integrated a location entry. GTK 2.10 also includes long-awaited support for drag-and-drop tab reordering, a feature that has been independently implemented in virtually every major GTK application including the GNOME terminal, Gaim, Firefox, and Gedit. Inclusion of tab reordering in GTK will eliminate the need for a lot of redundant code, and it will ensure that tab reordering looks and feels consistent in all GTK applications. GTK 2.10 also includes many improvements to printing functionality, including a new cross-platform compatible, high-level printing API that will simplify a few of the challenges associated with maintaining portable GTK applications.
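To give a sense of how little code the built-in tab reordering requires, here is a minimal PyGTK sketch. It assumes GTK and PyGTK 2.10; the window title and page names are invented for the example.

```python
# Minimal PyGTK sketch of GTK 2.10's built-in tab reordering.
# Assumes GTK/PyGTK 2.10; window title and page names are invented.
import gtk

win = gtk.Window()
win.set_title("Reorderable tabs")
win.connect("destroy", gtk.main_quit)

notebook = gtk.Notebook()
for name in ("one", "two", "three"):
    page = gtk.Label("Page %s" % name)
    notebook.append_page(page, gtk.Label(name))
    # One call replaces the custom drag-and-drop code apps used to carry.
    notebook.set_tab_reorderable(page, True)

win.add(notebook)
win.show_all()
gtk.main()
```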

The theme system has received massive improvements in GTK 2.10. GTK now allows theme developers to use symbolic colors, a feature that could finally make it practical to use multiple color schemes with a single theme. This highly desirable feature will enable users to customize the colors used by GTK themes without having to alter the theme itself. New style properties have been added to a number of widgets, including tabs, menus, trees, and buttons.
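Here is a hedged sketch of how symbolic colors might be exercised from application code, assuming GTK 2.10's gtk-color-scheme setting and the "@name" rc syntax; the color names, values, and the "accent" style are invented for illustration.

```python
# Hedged sketch of GTK 2.10 symbolic colors. The color names, values, and
# the "accent" style are invented; the @name rc syntax and the
# gtk-color-scheme setting are the GTK 2.10 features being illustrated.
import gtk

# Define named colors that rc files and themes can refer to symbolically.
settings = gtk.settings_get_default()
settings.set_property("gtk-color-scheme",
                      "accent_bg: #336699\naccent_fg: #ffffff")

# An rc snippet that uses the symbolic colors instead of hard-coded values;
# changing the color scheme recolors the style without editing the theme.
gtk.rc_parse_string("""
style "accent" {
    bg[NORMAL] = @accent_bg
    fg[NORMAL] = @accent_fg
}
class "GtkButton" style "accent"
""")

button = gtk.Button("Accented button")  # new buttons pick up the style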

GTK 2.10 includes several improvements to GDK, the portable drawing toolkit used by GTK, including an experimental, native OS X GDK backend that will eventually make it possible for GTK apps to run on OS X without X11. An experimental framebuffer GDK backend is also available in this release. A new function has been added that will enable GTK applications to detect the presence of a compositing manager like XGL, possibly a prelude to more extensive integration of translucency in various GNOME applications, like real transparency in the GNOME terminal.
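The compositing check in particular amounts to a one-liner. A small sketch, assuming the new call is gdk_screen_is_composited() (exposed in PyGTK as Screen.is_composited()) and that the fallback message is up to the application:

```python
# Sketch of detecting a running compositing manager (e.g. under Xgl/Compiz).
# Assumes the new GTK 2.10 call is gdk_screen_is_composited(), exposed in
# PyGTK as gtk.gdk.Screen.is_composited().
import gtk

screen = gtk.gdk.screen_get_default()
if screen.is_composited():
    print("Compositor detected: real (ARGB) translucency should be available.")
else:
    print("No compositor: applications must fall back to faked transparency.")
```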

I use GTK and the GNOME libraries for many of my own development projects, particularly for simple utilities. The Ruby and Python bindings for GTK are great, and extremely useful for rapid application development. Released earlier this week, the latest version of the Ruby GNOME bindings includes support for the poppler PDF rendering library and for the VTE library used by the GNOME terminal. Look forward to GTK 2.10 in the upcoming GNOME 2.16 release!

Jan 01 , 2019 / By :

Microsoft has finally given in and will support the OpenDocument Format in its Office products. This Thursday, the company will release a sponsored piece of software on SourceForge.net called the Open XML Translator, which will convert documents between Microsoft's own Open XML format and the OASIS-standardized OpenDocument Format (ODF).

The plug-in will allow users to open and save documents in the ODF format, and as an added bonus it will work with older versions of Microsoft Office, not just the new 2007 version. The plug-in is being developed by a French partner company called Clever Age; Microsoft's goal is to have it available by the end of this year and working with Excel and PowerPoint at some point in 2007. What made Microsoft decide that ODF support was important? Microsoft's general manager of interoperability and standards, Tom Robertson, has the answer.

"We're hearing that (customers) don't want homogeneity–they want diversity; they want translatability. And some customers are saying they would like us to focus on this to a certain extent, to make sure the product is high quality."

Although everything sounds peachy, one major catch is that some Office features won't be available when users choose to save a document with the ODF setting. This is mainly because of the difference in formats between ODF and Open XML. Nevertheless, this development is a major step for Microsoft in the world of Free and Open Source Software, especially since the company has chosen to use SourceForge rather than its own CodePlex site to host the plug-in.

Once the Open XML Translator plug-in is finalized at the end of the year, it will be interesting to see how it stacks up next to the ODF Plugin developed by the OpenDocument Foundation Inc. In any case, Microsoft has finally realized that ODF is growing in popularity and that the safest bet is to support both Open XML and ODF. It's all about choice, right? My question to you: do you like having two different formats, or would you rather see Microsoft work closely with the OpenDocument Foundation to come up with one ruling format for the future?

Jan 01 , 2019 / By :

Our understanding of vertebrate genetics and development has been helped immensely by the ability to target genes for deletion in the mouse, a technique known as a gene knockout. Once one copy of a gene is eliminated, animals can be bred that lack the gene entirely, allowing the gene's role in development, behavior, and health to be assessed in an organism that's closely related to humans. Many of the genes identified as being important in other organisms or via biochemistry have since been knocked out. But this technique has its limits, not the least of which is that there's a tendency to knock out only the genes that we expect will be interesting, and labs wind up racing to be the first to knock out the most interesting genes.

Last week's edition of Science takes a look at the state of the art in mouse knockouts. One article looks at some decisions involved in the NIH's program to create a publicly accessible collection of knockouts in every single gene of the mouse genome. The state of Texas had funded a combined public/private consortium that hoped to get a big slice of the NIH work, but seems to have missed out on the money. An official explanation of why hasn't been released yet, but speculation suggests that concerns exist about both the technology used and the accessibility of the mice produced by the Texas group. Meanwhile, in China, Fudan University hopes to get in on the NIH action via a partnership with Yale University. Instead of targeted knockouts, these researchers are getting transposons (mobile genetic elements) to hop around the mouse genome and searching for cases where they have hopped into genes. Although not as directed as targeted knockouts, the ease of generating transposon hops allows large numbers to be screened, potentially making up for a lack of efficiency.

The final piece looks at why the NIH effort and a parallel project in Europe seem to be necessary: rampant inefficiency. It quotes a researcher who has attempted to obtain a number of mice generated by others:

"Once I requested a mouse, and the guy wanted everyone from himself to his grandmother to be a co-author on everything we published with that mouse," says Aguzzi. "It was like scientific prostitution." Another time, he says, a researcher promised him a mouse but took more than a year to deliver: "[The investigator] should have just said his cat ate it; it would have saved us a lot of trouble."

But some mutations are far too easy to obtain. Of the over 11,000 genes knocked out, over 700 had been targeted at least three times by separate research groups, and one had been hit 11 times. Over a quarter of the mice aren't in publicly accessible collections.

Even assuming better public access can be achieved, a number of significant problems will remain. Several companies have set up mouse knockout services, and have patented the specific deletions they offer. Aside from IP issues, there are biological complications. Animals with a developmental phenotype that survives to birth will be easy to identify; behavioral defects or embryonic death can be much more challenging to characterize, and it's unclear how well that will work out on the scales of these projects. There's also the question of technique. Knockouts can be complete deletions, can have a marker inserted in place of the gene, and/or can be set up to be deleted in specific tissues or have the deletion induced by a specific drug treatment. Different programs are focusing on different techniques. Meanwhile, an increasing number of researchers are focusing on the classical genetic approach of randomly mutating mice and looking for interesting phenotypes before going back and identifying the genes (an effort I've been involved in). Each technique has its benefits, so I expect that integrating all of these mouse sources and generating a consensus on how to push forward in an organized way will be extremely challenging.

Jan 01 , 2019 / By :

A while back I wrote an article about a specific form of hydrogen that is known to exist in interstellar gas clouds. Now I am pleased to present some results along the same lines, which suggest that we know even less than we thought we did. These clouds form a reaction chamber that is quite unlike anything we have on Earth. The temperature is very low, so the energy to start a reaction cannot be obtained by simply banging the atoms together. Instead, the driving force behind interstellar chemistry comes from the ultraviolet radiation emitted by surrounding stars. However, this produces an unusual set of conditions for a reaction to proceed. For instance, an ultraviolet photon may ionize a species by stripping off an electron. If that ion then collides with another atom which is also in a pre-prepared state, a reaction may occur. However, it can just as easily collide with an electron, in which case it returns to a neutral state. Thus, many strange intermediate species of hydrogen molecules are possible.

One of these much-discussed species is H2-, which is essentially two protons and three electrons. This species is important, since it is expected that its production and decay rates will dominate the equilibrium state of such clouds. Despite this, numerous attempts to accurately detect the existence of the species have failed. Now a laboratory experiment has both confirmed that such an anion does exist and measured how long it lasts before falling apart. The answer surprised everyone: H2- survives for about 8 microseconds. The theoretical predictions for such a beast are pretty simple; like a child's block tower, it falls apart as soon as it can, giving it a predicted lifetime of just half a microsecond.

Interstellar chemistry is throwing up a few surprising results, which show we don't understand the underlying fundamental physics and chemistry as well as we should.

Jan 01 , 2019 / By :

Having the public use your company name as a verb is, in one way, a marketer’s Shangri-la. Many companies would kill for the name recognition and popularity of “xerox,” “google,” and “hoover.” For the companies themselves, though, being “verbed” has its dark side. A company that does not defend its trademark risks losing it when it becomes a common figure of speech, which explains why Google is not happy about its recent inclusion in a new Merriam-Webster dictionary.

Merriam-Webster’s Collegiate Dictionary now includes the word “google,” which it defines as a verb (prediction: within a decade, it will also be an expletive, as in “go google yourself!”). Sensitive to the search engine’s concerns (and to its legal team), the dictionary does note that the word is trademarked. Dictionaries, news outlets, and websites that do not note this risk receiving a strongly-worded letter asking them to either remove the offending use of the term or note that it is trademarked.

Right now, “to google” actually means to use the Google search engine. Google no doubt fears that the word will go the way of “hoover,” which has become a generic term for “vacuum” in the UK. If “google” ever comes to mean simply “to search the Internet,” the company’s careful branding and promotion will be diluted and the name will lose value.

“Google” joins other up-and-comers in the new dictionary, sharing page space with words like “himbo” (male bimbo) and “mouse potato” (think couch potato, but with a computer). Although it has just made it into the dictionary, “google” has been on linguists’ radar screens for years. In 2002, for instance, it was judged the “most useful” new word of the year by the American Dialect Society.

Jan 01 , 2019 / By :

Before getting our first TiVo a couple of years ago, we were faced with a choice: either the world’s best-known DVR or a media center PC. A home theater PC was attractive because of its versatility, but the thought of having a PC in the living room proved to be a turnoff to my better half, so the TiVo won out. Despite the inroads that HTPCs have made into the living rooms of the world, more traditional DVRs and cable set-top boxes are still far more popular.

That may change over the next few years, according to a study by ABI Research. The research firm is predicting heavy growth ahead for media center PCs and associated hardware, up to US$44.8 billion by 2011 from its current US$3.7 billion level. "With the arrival of faster in-home digital networking technologies such as MoCA, an industry-accepted framework for networked digital media distribution in DLNA, and the increase in both pay-TV and Internet content moving over in-home networks, the home media server is becoming a key beachhead in the digital home," according to principal analyst Michael Wolf.

The research firm calls the amalgamation of media center PCs, PVRs, gaming consoles, network attached storage (NAS) devices, and set-top boxes "digital media servers." As consumers acquire more and more content in a purely digital form (i.e., without physical media), the use of digital media servers to store and move the content around the house or onto portable devices will take on greater importance.

It’s a familiar problem for those of us who have embraced digital content distribution. Here’s an example: say that in a couple of years, you are able to download a high-quality, full-length Hollywood blockbuster from Netflix to your PC. Do you want to sit at the computer desk and watch the movie on your 23" LCD or would you rather stream it into the family room so you can take it in on your 42" plasma display and Surround Sound system? If services like full-movie downloads are truly going to take off, producers and distributors will have to overcome the problem of in-home storage and distribution with a mass-market-friendly solution (assuming they ever get that pesky DRM thing solved).

There’s one cloud on ABI Research’s rosy horizon: CableCARD. Wolf believes that Vista’s support for CableCARD will help move HTPCs further into the mainstream. However, CableCARD’s marriage to the PC may never be fully consummated. First, do-it-yourselfers will be left out of the HTPC/CableCARD fun. For the foreseeable future, only OEMs like HP and Dell will be certified for use with CableCARD—and those systems will be using CableCARD 1.0. That means no support for such niceties as video on demand, interactive services, or multistreaming—and it won’t be compatible with version 2.0. In fact, there is no guarantee that Vista will ever support CableCARD 2.0, which could limit the potential for widespread adoption of media center PCs.

The potential for growth in the media center PC, NAS, and other home digital media storage and distribution markets is there, but if the stars don’t align, it may never come to pass. The broadcast flag, while popular with content producers, could have a chilling effect on innovation in the consumer electronics market. If legislators can resist media industry pressure to enact the broadcast flag and the market is allowed to grow unencumbered by government regulation, the future for HTPCs and home media networks may be as bright as some believe.