The noose is tightening around Russian website Allofmp3.com. The record industry, which now has the site on its radar screen, won a small victory in the UK this week that will allow it to file suit against the site. The British Phonographic Industry (the UK’s version of the RIAA) received permission from London’s High Court to “serve proceedings” against the website. When that happens, the Russian judicial system will be obligated by international agreement to look into the matter, which means another legal headache is developing for Allofmp3.
The site already has to contend with two legal cases against its director and former director, and additional pressure from the UK won’t make things any easier for a service that finds itself in the crosshairs of the international community’s Piracy Sniper Rifle. The Americans have been leaning hard on Russia to do something about the site, but Allofmp3 just keeps chugging along, offering up new Dashboard Confessional, Keane, and Red Hot Chili Peppers for under US$2.
The site has weathered Russian legal scrutiny before, but the newly politicized claims from abroad could make it harder to stay in business. Allofmp3 claims to operate with a valid license, of course, but none of the money they make trickles back to artists or labels abroad. The BPI plans to argue that even if the license is legal under Russian law, it is certainly not legal in the UK, where Allofmp3 now accounts for 14 percent of all legal downloads.
Despite the emphasis that the music industry often places on sites like Allofmp3.com, it’s worth remembering that most music found on portable players is legitimate.
Are the site’s days numbered? It’s too soon to tell, but prudence would suggest that if you have any credit at the site, now’s the time to use it.
I've said it before, I'll say it again: brains are strange things. The fact that we, as humans, have comparatively large and complex brains accounts for our dominance on the planet, our language, our technology, and all the other things that make it possible for me to sit in front of my PowerBook and write this column, and for you to read it. But there's much that happens within our skulls that we don't fully understand yet, and by and large brains are fragile things. It doesn't take much to stop one working – a few minutes without oxygen is enough to destroy a brain.
There is a common assumption that brains are also not very good at repairing themselves. By and large this is true. Unlike the liver, which has an enormous capacity for self-repair, neurons that die are often never replaced, making spinal cord or traumatic brain injuries permanent conditions. But not always. Such is the remarkable case of Terry Wallis, who suffered a traumatic brain injury that left him in a minimally conscious state (MCS) for 19 years.
Over this time, Terry's brain slowly rewired itself in a way that has not been observed before, and Terry emerged from his MCS able to speak and with a limited ability for movement that has since improved. It must be stressed that the Hollywood image of a man waking from a coma and taking up where he left off could not be further from reality in Terry's case. Despite regaining the power of speech, Terry is still disabled following his brain injury.
Over the course of his recovery, doctors used a new imaging technique called diffusion tensor imaging (DTI) to follow the way his brain rewired itself, and compared it to an MCS patient showing no signs of recovery. The findings are published in the Journal of Clinical Investigation, along with a commentary. Terry's accident severed a lot of neuronal connections between different brain areas. The growth of new axons reconnected these areas, but in doing so created pathways and structures never normally seen in brains.
I imagine that these findings are going to be seized upon by bioconservatives to support their position on the sad affair of Terri Schiavo, but the only similarity between these cases is the patients' first names. Terri Schiavo suffered from a persistent vegetative state (PVS), a different condition from MCS, in which patients experience infrequent but genuine periods of consciousness. Terri's case was reviewed time and time again, and each diagnosis confirmed the PVS, as did her autopsy. Recovery from MCS remains rare, especially after more than 12 months, but Terri Schiavo had lost most of her cerebral cortex, a much more severe injury than that of Terry Wallis.
Another neuro-related news item caught my eye the other day, and it serves to reinforce my original point. Natives of the British Isles will be familiar with the Geordie accent – I'm not sure I can think of any Geordies who might be well known to US readers, but suffice it to say that those who hail from Newcastle have a rather distinctive way of speaking. This was true for one Linda Walker, up until she suffered a stroke at the age of 60. Following the stroke, Linda's Geordie lilt was gone, replaced by a Jamaican accent. Strange as it sounds, there have been around 50 recorded cases of foreign accent syndrome, which results from damage to the speech centers of the brain.
Well, we can't say we didn't see this coming. Apple, which was expected to come out with a new Mac to target the education market, has done just that, but it's not what you think.
Apple has begun offering, through educational channels, a new low-priced Intel iMac. At US$899, the new iMac is priced US$400 cheaper than anything available to the general public and just US$100 more than the high-end Intel Mac mini. One might expect a horribly crippled machine at this price, but the US$899 model actually compares fairly well to the US$1299 model. Both machines use the Intel 1.83 GHz Core Duo and sport a 17-inch widescreen LCD. The only major differences are that the EDU iMac comes standard with an 80GB drive (vs. 160GB), a 24x Combo Drive (vs. 8x SuperDrive), and Intel GMA 950 integrated graphics with 64MB of shared memory (vs. ATI Radeon X1600 with 128MB). The consumer model also sports built-in Bluetooth and an Apple Remote, which the new low-end EDU model lacks.
The new machine also comes standard with a keyboard and mouse, making this truly an affordable all-in-one solution for education, something the 'budget priced' Mac mini cannot offer. It should also be noted that this new iMac spells the long-awaited end of the crufty old eMac:
The 17-inch iMac for education is available immediately and will replace the eMac®, Apple’s last CRT based computer, providing students and teachers everything they need to learn and create in today's digital classroom, all in the ultra-efficient iMac design.
This new low-priced iMac is enough to encourage awkward glances in the general direction of my credit card; I have always admired the price point/value of iMacs, and the slightly lower specs of this machine don't hinder that. This would be an ideal way for a lot of people to jump on board the Intel bandwagon. The question is: what does this mean for the consumer line? Speed bumps ahoy?
Paris has just announced an ambitious new push in its révolution numérique, the city’s plan to make itself into one of the world’s most wired capitals. At the moment, nothing says “wired” quite like “wireless,” so Paris plans on blanketing the city with a free WiFi network operated by private companies.
The socialist mayor of Paris (that’s not a pejorative; he’s actually a card-carrying member of the Socialist Party), Bertrand Delanoe, wants the system up and running by the end of next year. “We will act fast and firmly… to create the most favorable conditions for Paris,” he told reporters. “It is a decisive tool for international competition and thus important for the city.”
But free WiFi is hardly a “decisive tool for international competition.” After all, more than 60 percent of Parisian households already have high-speed ‘Net access, and businesses aren’t likely to be excited by the prospect of entrusting their connectivity to a sometimes-flaky wireless signal, with all of its security woes. Some cities are also learning the hard way that reliable WiFi is easier dreamed up than implemented. Still, the system promises Internet access in public places like parks and libraries, and it’s hard to imagine anything better than reading Ars from a bench in the Jardin du Luxembourg on a fine spring day.
What’s more intriguing than the WiFi announcement is the second part of the plan, which is designed to ensure that 80 percent of Parisian addresses are wired with fiber by 2010. The ambitious goal will be aided by a government tax cut on companies that lay fiber over and through city-owned rights-of-way (think sewers). The resulting system should deliver super-fast ‘Net connections to citizens and businesses across the City of Light.
The city also plans to open Espaces publics Numériques in many arrondissements that will allow people to use computers and take classes on computing and Internet technology. The goal is to make Paris one of the top digital cities on earth in the next decade, a move that could help the city stay competitive in the global labor market.
Paris isn’t alone in its ambitions. It faces competition from most major cities, including London, San Francisco, Chicago, and others, though most cities have so far only announced plans for WiFi. Paris’ aggressive fiber rollout plans could give it an edge, potentially making the “Socialist City” one of the best places in Europe to do high-tech business.
Sony’s E3 announcement of motion-sensitive capability for their PlayStation 3 controller took everyone by surprise. The biggest surprise, however, was that the developers of Warhawk had only two weeks to implement motion control into their demo. The idea that Sony would keep their own developers in the dark about such a crucial new feature caused many to wonder if the electronics giant was flailing around looking for a winning strategy.
Now, in an interview at IGN, Sony Santa Monica Studios Game Director Brian Upton has revealed a little more information. According to Upton, his development team, Incognito, has “secretly been working with Sony on the tilt technology for a while, but it wasn’t until the last few weeks before E3 that they received a working controller.”
The question that comes to this reporter’s mind is: how long is “a while”? Was it before or after Nintendo announced their motion-sensitive controller in September of last year? And if “a while” was that long, why did it take until a few weeks before E3 to get a working controller to developers?
As we have argued before, the answer to the second question may be found by taking a closer look at Sony’s legal battles with Immersion over its patent on “rumble” technology in game controllers. Other companies, such as Microsoft, settled with Immersion and continue to provide rumble functionality in their game consoles. Sony decided to play tough with the smaller company, but it lost the infringement case and has already lost one of two pending appeals, making it seem likely that Sony will eventually have to open its pocketbook and write Immersion a substantial check.
Sony’s official policy is that rumble was removed from the PS3 controller because it interferes with the motion sensors. This statement doesn’t stand up for a number of reasons. For one, Nintendo has demonstrated their motion-sensitive controllers that include rumble technology. Even if Sony couldn’t manage to make both work simultaneously, it would be easy enough (from an engineering standpoint, that is) to automatically turn the motion sensors off while the controller is rumbling. A more likely answer is that Sony was hoping to include rumble right up until the last minute, pending a successful appeal. However, having suffered a tremendous legal defeat instead, it appears as though the company has decided that Immersion’s involvement with the PlayStation brand is finished.
One important thing to remember is that although both the PS3 and the Nintendo Wii can claim “motion sensor ability” as a bullet point on their spec sheets, the two systems are really not very similar. The Wii offers absolute position sensing via a sensor bar of infrared LEDs placed above or below the television, whereas the PS3’s controller merely senses relative motion, primarily tilt around the three axes of motion. Games written specifically for the Wii’s controller cannot be ported to the PS3 without significant modification.
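The practical consequence of relative sensing can be sketched with a toy calculation (all numbers here are hypothetical, not specs from either console): an absolute system reads position directly, while a tilt sensor's rate readings must be integrated over time, so even a tiny constant sensor bias accumulates into a large error.

```python
# Toy illustration of why relative motion sensing drifts over time.
# All numbers are hypothetical; this is not based on either controller's specs.

def integrate_tilt(rate_samples, dt=0.01):
    """Estimate a tilt angle by summing angular-rate samples over time."""
    angle = 0.0
    for rate in rate_samples:
        angle += rate * dt
    return angle

# The controller is actually held still (true rate = 0 degrees/second),
# but the sensor reports a small constant bias of 0.5 degrees/second.
bias = 0.5
samples_1s = [bias] * 100    # one second of readings at 100 Hz
samples_60s = [bias] * 6000  # one minute of readings

drift_1s = integrate_tilt(samples_1s)    # half a degree of error
drift_60s = integrate_tilt(samples_60s)  # thirty degrees of error
```

An absolute reference, like a fixed light source the controller can see, lets the system throw away that accumulated error on every reading, which is why the two approaches enable rather different kinds of gameplay.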
Universal Music Group, a division of Paris-based Vivendi Universal, has just announced plans to revamp the way they sell music CDs in European markets. Rather than serving up every product in the same 20-year-old CD case design, Universal will divide their offerings into three tiers with different packaging and different prices.
At the bottom end, a “basic” package will be introduced, consisting of a simple sleeve made of heavy cardstock. It should sell for €9.99, the same price Apple charges for a full album download from the European iTunes Music Store. The package sounds reminiscent of Neil Young’s Mirror Ball, among other cardboard packages offered up over the years, but it’s not a one-off specialty item. Universal hopes to ship some 100 million of these albums by year’s end, starting from the planned September introduction. The basic package is earmarked for older releases and other slow sellers in the traditional jewel case format, and will come with no liner notes or other extras. According to the Guardian, the no-frills concept is an “acknowledgement” of the power of iTunes.
If extra features are what you want, you’ll be more interested in the premium “DeLuxe” packaging at MSRPs around €19.99. That version, aimed at collectors and gift-givers, will feature a beefed-up case and tons of bonus features, ranging from expanded notes and bonus tracks to DVDs filled with behind-the-scenes clips and live performances.
In-between these extremes comes the “Super Jewel Box,” a sturdier version of today’s brittle jewel cases, but otherwise much the same as anything you’d find at your local Wal-Mart today. It’s supposed to sell for €14.99, comparable to today’s average CD price, and will feature new releases and popular back-catalog items.
Universal claims to understand that digital downloads are the future, but then goes on to defend the need to boost physical CD sales again. The company’s executive vice-president of international marketing and A&R, a fellow by the notable name of Max Hole, says that downloading has renewed interest in older titles, and that all you need to do in order to sell those albums is lower the prices. “We can grow the CD market,” says Mr. Hole. “That might be a little optimistic, but we can certainly slow its decline.”
I can see some value in the new packaging options: premium bundles will always have a following, and my CD rack is littered with broken cases, so the Super Jewel thing sounds like a welcome upgrade. But while I enjoy Mirror Ball immensely and can appreciate the Earth-friendliness of paper packaging, I’m not convinced that pricing is the only problem to solve regarding slow back-catalog sales. Those albums just aren’t promoted, and there’s also a major convenience difference between downloads and plastic discs. Besides, if all you’re getting is a plain cardboard sleeve with cover art, and you just want to rip it to your iPod anyway, what’s your motivation for getting physical? There’s more research to do regarding the true drivers of the digital revolution, and whether it’s more about price, about convenience, or something else entirely.
I was surprised to find an article on the dry topic of crop yields so compelling. It seems to have it all: new information, pulled together from a collection of studies, suggesting that the current consensus was based on an outdated technique. The topic is politically charged, and the editors even let the authors get away with a pun in the title ("Food for Thought"). How does it all fit together?
As atmospheric carbon levels go up and the planet warms up, crops are expected to be impacted in a variety of ways. Increased temperature and CO2 are expected to accelerate growth, but soil moisture will decrease, potentially counteracting these effects. Based on a number of studies, it was expected that these factors would largely balance out, with a slight decrease in crop yield possible. The new study addresses one of these factors: how CO2 affects growth and yield of crops. The authors note that the estimates of these effects being used in assessing the impact of rising carbon levels are based on experiments in enclosed buildings, where it's easy to control the atmosphere. But technology has since improved, and free-air concentration enrichment (FACE) technology allows crops to be grown in the open under controlled atmospheric conditions, more closely approximating real-world conditions.
What happens when you compare the results of FACE experiments with enclosed results? The enclosed experiments produce crop yields that are more than two-fold higher than those produced using FACE technology. This, in turn, suggests that the expectations for the future crop yields may be over-estimates. As the authors note, "This casts serious doubt on projections that rising CO2 will fully offset losses due to climate change." They do, however, wrap up on an optimistic note: crop plants have been selectively bred for a number of properties, and increased growth at higher CO2 levels may be as accessible to breeding as anything else. The same technology that allowed us to recognize the problem may be useful in breeding a correction for it.
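A back-of-the-envelope illustration of the argument, using made-up percentages rather than the paper's actual figures: if climate-driven losses were assumed to be roughly cancelled by the CO2 boost measured in enclosed chambers, then halving that boost, as the FACE comparison suggests, leaves a net decline.

```python
# Hypothetical numbers sketching the paper's argument; not the study's data.
# Projections assumed the CO2 fertilization boost would roughly offset
# climate-driven losses. If open-air (FACE) trials show the boost is only
# about half the enclosed-chamber estimate, the net effect flips negative.

climate_loss = -0.10          # assumed 10% yield loss from heat and drought
co2_boost_enclosed = 0.10     # boost measured in enclosed experiments
co2_boost_face = 0.05         # FACE trials: roughly half the enclosed figure

net_enclosed = climate_loss + co2_boost_enclosed  # effects balance out
net_face = climate_loss + co2_boost_face          # a net decline remains
```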
Some Apple rumors, like the iTablet, never die, no matter how much time passes or how little sense they make. At the other end of the rumor spectrum are ideas so obvious, like the flash-based iPod Nano, that it is only a matter of time before they become product. There are also rumors that no one believes right up until they happen, like the switch to Intel, and there are the rumors that make sense but never seem to happen.
An Apple spreadsheet would fall into the last category.
Was it only a year ago that AppleInsider was touting "Numbers" as the next big software release from Apple Computer? Yes, it was.
Rumors that Apple Computer has been quietly developing its own spreadsheet solution gained a dab of credibility this week as sources pointed to a revealing company filing with the United States Patent and Trademark Office. Just two days after requesting a trademark on the word 'Mactel,' which seemingly describes the convergence of Macintosh design with Intel hardware, Apple on June 8th filed for a standard character mark on the word 'Numbers.'
It's a little over a year later and a new spreadsheet rumor is out, except this time ThinkSecret is the messenger, the name is "Charts," and it's *confirmed* to be true.
Long rumored—or at least, assumed—to be in development, sources say Apple is not planning on positioning Charts as a competitor to Microsoft's Excel, but rather as a more consumer-friendly spreadsheet application that can handle the needs of home users and small businesses but not pretend to execute any of the more advanced functions of Excel.
Presumably, Charts will allow the import of Excel spreadsheets—unless they have advanced functions it cannot pretend to execute. Beyond that, it's anybody's guess. Along with the obvious, ThinkSecret claims development includes nonsensical stuff like Address Book integration. Pricing is unknown; iWork currently sells for US$79.
So, what does Charts—if true—mean for Mac users? Not much.
To date, iWork has made no impression as an office suite. Of course, Apple stresses that it is not competing with Office for the Mac, but anyone who has used Pages could have told you that. The only way this could matter is if Apple did what should have been done three years ago when iWork first came out. Apple needs to make iWork free on new Macs, a true replacement for AppleWorks. That would be a rumor worth seeing come true.
At this point, if you were interested in trying out Windows Vista Beta 2, you've almost certainly done so already. Though some of us don't realize it right away, Microsoft has two goals in mind when it releases a public beta: give the tech community a preview of the new operating system and, more importantly, gather massive amounts of bug reports and feedback about the new OS.
Focusing on the second goal, customer feedback, one way Microsoft allows testers to report bugs is through the Microsoft Connect website. Besides reporting, testers are also allowed to view, validate, and search for issues. The other night, Robert McLaws of Longhorn Blogs was viewing and commenting on some bugs when he decided that he'd like to see a statistical analysis of the some 28,700 Vista bugs in the system. Over four and a half hours, McLaws manually added every bug posted on Connect to an Excel 2007 spreadsheet, and he discovered some fantastic trends.
Once McLaws removed 1,072 duplicate items, he started extracting statistics from the raw data. Some of his intriguing findings include:
- An average of 81 bugs per day are reported for Vista
- The count of bugs per day is increasing, not decreasing
- Around 200 bugs are reported within the first 24 hours of a new release
- Over 20,000 bugs have been closed so far, where "closed" means a status of "Closed" or "Resolved"
- When Microsoft released the Start Orb, 353 bugs were added to Connect during the first day and 338 during the second
- Only 1/5 of the total bugs submitted are still open
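For the curious, the kind of aggregation McLaws performed can be sketched in a few lines of Python. The records and field names below are made up for illustration; Connect's actual export format will differ.

```python
# A minimal sketch of the bug-report aggregation: collapse duplicates,
# then compute per-day counts and the share of bugs still open.
# The records and field names are hypothetical, not Connect's real schema.
from collections import Counter

bugs = [
    {"id": 1, "date": "2006-06-08", "status": "Active"},
    {"id": 2, "date": "2006-06-08", "status": "Closed"},
    {"id": 2, "date": "2006-06-08", "status": "Closed"},   # duplicate entry
    {"id": 3, "date": "2006-06-09", "status": "Resolved"},
    {"id": 4, "date": "2006-06-09", "status": "Active"},
]

# Collapse duplicate ids (identical entries map to the same key).
unique = {b["id"]: b for b in bugs}.values()

# Bugs reported per day, and the fraction of unique bugs still open.
per_day = Counter(b["date"] for b in unique)
open_share = sum(b["status"] == "Active" for b in unique) / len(unique)
```

McLaws did the equivalent by hand in an Excel 2007 spreadsheet over four and a half hours; with an exportable data set, the same questions are a few expressions.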
One item above that is worth discussing is the increase in the number of bugs per day. While one could conclude that an increase in bugs per day is a Bad Thing, McLaws notes that more people are using the system now than ever before, especially with the public beta. As a software engineer myself, I agree with McLaws; the more hands on a system, the more problems will be uncovered. Once the company releases RC1 to the public, we can expect another drastic spike in the bug count, although the items should be more in-depth and less obvious than the bugs reported during Beta 2.
Overall, Microsoft still has a ways to go before it has an operating system that can be sold and proudly delivered, but we knew that even without pretty graphics and some statistical breakdowns. Just like every piece of software that has ever shipped, Vista will ship with some bugs intact. Let's just hope Microsoft knocks out the big boys and doesn't leave them for a service pack instead.
Originally designed for the GNU Image Manipulation Program, the GTK graphical application development toolkit provides an extensive assortment of widgets and controls for cross-platform software construction. The latest version, GTK 2.10.0, has been officially released, and with plenty of exciting new features for users and developers, it is a significant improvement over previous versions. The popular toolkit is widely used on a variety of platforms and provides the foundation for the GNOME desktop environment. Available under the GNU LGPL, which permits use by proprietary as well as open source software, GTK has been adopted by numerous developers of both kinds.
After receiving a steady litany of complaints about the absence of a visible file textbox in the GTK file chooser dialog, the GTK developers have finally relented and integrated a location entry. GTK 2.10 also includes long-awaited support for drag-and-drop tab reordering, a feature that has been independently implemented in virtually every major GTK application including the GNOME terminal, Gaim, Firefox, and Gedit. Inclusion of tab reordering in GTK will eliminate the need for a lot of redundant code, and it will ensure that tab reordering looks and feels consistent in all GTK applications. GTK 2.10 also includes many improvements to printing functionality, including a new cross-platform compatible, high-level printing API that will simplify a few of the challenges associated with maintaining portable GTK applications.
The theme system has received massive improvements in GTK 2.10. GTK now allows theme developers to use symbolic colors, a feature which could finally facilitate utilization of multiple color schemes with a single theme. This highly desirable feature will enable users to customize the colors used by GTK themes without having to alter the theme itself. New style properties have been added to a number of widgets, including tabs, menus, trees, and buttons.
GTK 2.10 includes several improvements to GDK, the portable drawing toolkit used by GTK, including an experimental, native OS X GDK backend that will eventually make it possible for GTK apps to run on OS X without X11. An experimental framebuffer GDK backend is also available in this release. A new function has been added that will enable GTK applications to detect the presence of a compositing manager like XGL, possibly a prelude to more extensive integration of translucency in various GNOME applications, like real transparency in the GNOME terminal.
I use GTK and the GNOME libraries for many of my own development projects, particularly for simple utilities. The Ruby and Python bindings for GTK are great, and extremely useful for rapid application development. Released earlier this week, the latest version of the Ruby GNOME bindings include support for the poppler PDF rendering library, and the VTE library used by the GNOME terminal. Look forward to GTK 2.10 in the upcoming GNOME 2.16 release!