The Tao of Mac Weblog has put into words what I'm guessing sums up the feelings of many .Mac users. The crux of Rui's argument is that US$99 is way too much to pay for services that haven't really been updated to meet modern expectations. Google's Gmail is mostly to blame, but many of .Mac's services and their pricing also seem vastly out of line with what you can get from third-party products. Applications like JungleDisk (which works with Amazon's S3 service) are vastly cheaper for online disk space, and the aforementioned Gmail or Yahoo! Mail fills the bill for web mail.
On the topic of mail, Gmail is obviously free and actually does a decent job of spam filtering on its own, whereas Apple's .Mac mail service is limited to whatever free space you have on your iDisk account and has woefully inadequate spam filtering when put up against Gmail's. Apple relies heavily on Mail.app's spam filtering capabilities, which can be hit or miss, and you lose all of that "built-in" functionality when you eventually come to the realization that you've outgrown the client. Sure, you can use spam filtering in your client of choice, but that's not always an option. For example, I often check my mail accounts via my cell phone and other portable devices that don't have the horsepower to filter every message. Server-side filtering is the way to go, and as it stands, I'm pretty sure that a large percentage of Apple's .Mac users have stuck around for the email address and service alone.
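To make the server-side argument concrete: a server-side filter classifies each message once, before any client fetches it, so even a low-powered phone sees a pre-cleaned mailbox. Here's a minimal sketch of the idea; the keyword weights and threshold are invented for illustration and have nothing to do with Apple's or Google's actual filters:

```python
# Toy server-side spam scoring: messages are classified once on the
# server, so every client (desktop, phone, PDA) sees the same result.
# Keyword weights and threshold are illustrative only.
SPAM_WEIGHTS = {"viagra": 3.0, "lottery": 2.5, "winner": 1.5, "free": 0.5}
THRESHOLD = 2.0

def spam_score(body):
    """Sum the weights of known spam keywords in the message body."""
    return sum(SPAM_WEIGHTS.get(word, 0.0) for word in body.lower().split())

def filter_mailbox(messages):
    """Partition messages into (inbox, junk) once, on the server,
    before any client connects."""
    inbox, junk = [], []
    for msg in messages:
        (junk if spam_score(msg) >= THRESHOLD else inbox).append(msg)
    return inbox, junk
```

Real server-side filters use statistical classifiers rather than fixed keyword lists, but the key design point is the same: the work happens once, upstream of every client.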
Rui also gets down on the other parts of .Mac: syncing, web albums, iChat, and the notoriously slow iDisk. He notes a few examples of what Apple could do to sway many users to the service: IMAP IDLE support, moblogging support, over-the-air iSync, and contact and calendar syncing.
One thing to keep in mind is that while these kinds of features, and beefing up .Mac in general, will sway some of us nerds, for many users .Mac is already a handy service that integrates with many of their applications and lets them do a lot of things they would otherwise have no idea how to accomplish. For people with limited computer skills, .Mac is worth US$99 per year and more. However, there is always something to be said for attracting "power users" and other "influencers" to a service. You want the people with the biggest mouths and the most reach to speak highly of your product, not dog it at every opportunity.
With all of that said, I'm almost certain that Apple has to be working on some sort of functionality upgrade to the service. In the face of all these new competitors, and without a significant update in features or pricing in what seems like forever, I'm betting that Apple has something exciting up its sleeve in the near future.
We've reported on Steam keeping track of statistics about how people are playing Half-Life 2 Episode 1, but this one is a thinker. Valve has released an update to the game, and it seems as if the biggest point was to make the game easier during the elevator scene. Remember, the game already has multiple levels of difficulty, but based on their numbers, this section was too hard for many players. Enough to change the game for everyone?
I loved the elevator portion of the game—it got my blood pumping. While I died more than once, when I was finally able to beat that section I got a nice sense of accomplishment. There's nothing worse than being stuck in one area in a game, but of course there's nothing better than finally getting past it and continuing on. You can't have one without the other. While I like the fact that Valve is trying to keep their customers happy, couldn't they have added this only to the lower difficulty levels and allowed those of us who like a challenge to enjoy the game as it was originally designed?
It's also worth noting that this update is applied when you start Steam up, whether you want it or not. Short of unhooking your internet connection whenever you start Steam, there's nothing you can do to keep it out of your system. The idea of near-mandatory patches makes me more than a little nervous, and it's a side of digital distribution we haven't really talked about. I know a lot of people who keep the games they play at a very specific version number, simply because they dislike what certain patches do to the game. If we lose that ability and a bum patch or update is released, you're going to have some grumpy people playing your game. Especially if you like a higher level of difficulty.
Science, and the pursuit of science, is often held up as a paragon of integrity. Noble researchers seek out empirical truths regarding the world around us, reporting their results and sometimes challenging dogma. Therefore, when reports or accusations of scientific misconduct or fraud occur, they sting all the more. Everyone expects a politician to be dishonest, but the men in white lab coats are held to a higher standard.
A notable recent example concerns the work on human embryonic stem cells conducted in South Korea under Dr Woo Suk Hwang. Readers may remember the excitement that surrounded the announcement, and then the furore following news that results were faked or enhanced, and that the postdocs working in the lab were pressured into donating their own eggs to the project.
Lest you think it only happens in South Korea, the past couple of years have seen other examples of scientific misconduct, and as a result, journals are beginning to take the issue even more seriously. This brings me to an article in last week's Science. As I've covered before, postdocs make up the backbone of most research labs in the US. Postdocs depend on good publications in the fight for the few faculty appointments, and they are often at the mercy of their PIs, since postdoctoral positions fall somewhere between jobs and studentships when it comes to employment law. More and more postdocs in the US are reporting that they too are subject to the same kinds of pressures experienced by those working under Dr Hwang. Worse yet, those postdocs who try to speak out about the problem find that being the lowest rung on the ladder means that if it's a choice between them and a high-flying tenured PI, there's no contest.
Stories abound about labs where studies start with preconceived endpoints, where data that 'doesn't fit' is left out, or, worst of all, where data that is knowingly false is published anyway. It is almost understandable. Funding is becoming ever harder to obtain, and there is constant pressure from university administrators to bring in more grant money and publish in prestigious journals. Almost, but not quite. If studies are approached with a fixed conclusion at the outset, how does that differ from the methods of the proponents of ID? And the more such scandals occur, the closer the public comes to ranking the men in white lab coats alongside baby-kissing politicians.
Please don't get the impression that labs such as these are the norm; data-fudging pressure cookers are the exception to the rule, but when they coincide with high-profile labs, it has more of an impact. In the decade that I've been working in research, I may have heard stories in the pub regarding other labs, but I can't think of a time when anyone has told me to pretend a result never happened, nor have I seen it happen to anyone else.
VoIP service provider Vonage has acquired three patents from IP license aggregator Digital Packet Licensing LLC. The patents (4,782,485, 5,018,136, 5,444,707) deal with various aspects of data compression for VoIP traffic, and are already the basis for lawsuits against Sprint, Verizon, AT&T, and Nortel, among others. In the press release about this acquisition, Vonage president Michael Tribolet says that “The acquisition of these patents is part of Vonage’s strategic plan to further develop our core technology and build on our innovative, affordable and feature-rich phone service. Vonage’s strategic plan also includes a concerted effort to grow, create and acquire other significant intellectual property portfolios.”
So far, this sounds an awful lot like patent trolling, doesn’t it? Buy a portfolio of patents with existing licensing deals and a couple of infringement lawsuits already in motion, then stand back and reap the rewards. But the situation isn’t quite that simple.
In other news yesterday, Vonage was itself slapped with a fresh patent infringement lawsuit, this time by a tiny tech outfit by the name of Klausner Technologies. Fresh off a successful lawsuit against AOL, based on the same digital voicemail patent (5,572,576), Klausner is asking Vonage for US$180 million in damages and royalties. Vonage is also already the target of a lawsuit from Verizon, where the telco claims infringement of seven different patents.
Where there’s smoke, there’s fire; maybe the Digital Packet patents were bought as insurance and defense against the incoming stream of suits, rather than aggressive tools for seeking easy revenues from lawsuit settlements or judgements. You be the judge.
I’ve been a fan and user of Vonage’s phone service for years, and in my opinion there’s nothing wrong with the service. The business side of Vonage is a different story. The company is hemorrhaging cash and may never be profitable. That’s not conjecture; it’s clearly stated in the company’s prospectus for its IPO filing. Vonage looks like a train wreck in progress, and these patent issues may speed or slow that progress, but they won’t stop it either way. I’m afraid I’ll have to change my phone provider soon, maybe in the next two or three years, not because I’m unhappy with my current one, but because I think it may go out of business.
Oracle CEO Larry Ellison has stated that the database company could soon be providing Linux support services to Red Hat customers. During an interview several months ago, Ellison announced plans to enter the Linux distribution business, potentially by acquiring Novell. Citing distribution compatibility concerns, Ellison claimed that Oracle would be better off with its own complete middleware stack rather than trying to support Oracle database and middleware software on a rapidly growing number of Linux distributions. Now it appears Ellison is leaning towards simply appropriating what Red Hat produces and building a support services business on top of it rather than buying up a distributor:
“Red Hat is too small and does not do a very good job of supporting [its customers]. … The great thing about open source, the most interesting thing to me is the intellectual property. … We can just take Red Hat’s intellectual property and make it ours, they just don’t have it.”
Ellison’s attitude isn’t all that surprising, since he made similar statements about JBoss following Red Hat’s acquisition of the Java middleware company in April:
“Why didn’t we buy JBoss? Because we don’t have to – if it ever got good enough we’d just take the intellectual property – just like Apache – embed it in our fusion middleware suite, and we’re done.”
Although Oracle’s approach may seem exploitative, it is important to keep in mind that the absence of ownership in open source software is intentional. In the open source software industry, companies contribute to a shared intellectual property commons and compete with each other on the basis of service and support quality.
Red Hat and Oracle previously enjoyed a strong, mutually beneficial relationship, but in the wake of the JBoss acquisition, Red Hat is now in the middleware market, where it competes directly with Oracle. By offering support services to Red Hat customers, Ellison hopes to limit Red Hat’s growth into Oracle’s territory and make some money at the same time.
Although existing Oracle customers will probably be interested in streamlining their support consumption and working with one vendor rather than two, it is doubtful that Oracle will be able to meet the needs of Red Hat users better than Red Hat. Red Hat certainly needs to be more responsive to certain kinds of support issues, but that doesn’t imply that a company with more resources and expertise will be more successful attempting to supply the same services externally. Ellison seems to think that by leveraging its superior resources, Oracle can beat Red Hat at its own game. I think that Ellison suffers from some misconceptions about the nature of the open source software development process, and fails to recognize that such a business model would make Oracle dependent on Red Hat in many respects. In order to support Red Hat customers, Oracle would have to work closely with Red Hat, and Oracle’s aggressive attitude really doesn’t give Red Hat any incentive to be accommodating. It seems to me that if Oracle decides to make Linux support a serious part of its business, it will have to create its own distribution or a derivative.
Today we looked at the man behind the first joystick, and from that launching point it's a good time to talk about everything that came thereafter. While most of us are familiar with gamepads starting with the Atari or NES and going from there, it's hard to wrap your mind around all the gaming controllers that have been released since. From the gimmicky to the surreal, if you can dream it, someone has done it. Some of the more popular modern games are played with dance mats or guitar controllers, but who really wants to control a game with a big finger that you stab into an arcade cabinet with a model of someone's behind? I'm convinced that game was only invented so we have an easy joke when we talk about game controllers.
1up.com has a look at some of the more "out there" controllers. Some I've used, some I've avoided, and others I've never heard of. A lot of these things just didn't work; ideas always move faster than technology, but unfortunately that rarely keeps companies from selling products anyway. I'm not going to be quite as hard on these guys as 1up was; I think it's important to note that it takes a lot of failures before we get something like a Guitar Hero controller or a DDR Pad. Of course, the Guitar Hero controller is a ripoff of the Guitar Freaks controller, and the DDR Pad is really just a Power Pad with different colors, so maybe there are just a few ideas out there and we keep repeating them.
I do think it's odd that they completely ignore rhythm games, since that's a genre that's always begging for custom controllers. From the Donkey Konga Bongos to the Beatmania controller to the aforementioned Dance Dance Revolution pads, most rhythm games have their own controller. Luckily most of these games are solid and avoid the gimmickry associated with the sillier controllers, so they're not as fun to read about. This article really makes me want to track down some Samba De Amigo controllers.
To an extent, there are two schools of thought about how to go about screening for a new drug. One could be called the "Intelligent Design" method: choose a target, study its structure and biochemistry, and then design a drug that specifically interferes with it. In this case, much of the work goes into understanding the system. The second, called "High Throughput Screening," takes a more evolutionary approach. All you need in this case is a test tube version of the process you want to affect. You search for drugs simply by dumping every chemical you can get your hands on into the test tube and looking for those that inhibit the process. Once an effective compound is identified, variations of it are synthesized and tested in turn.
Although less targeted, high throughput screening has some advantages. You don't need a complete understanding of the process you're trying to alter in order to test compounds, meaning it will work with complex and poorly understood disease targets. Malaria ranks pretty high up on the scale of complex and poorly understood diseases, and a reader was kind enough to point me to an article that describes a screen of compounds for the ability to inhibit the growth of drug resistant strains of the malarial parasite. The technique also took advantage of a different aspect of high throughput screening: the majority of the drugs they tested have already been approved by the FDA, meaning they've passed rigorous safety screening, and would be relatively inexpensive to bring to market for a new use.
Nearly 200 drugs were identified that reduced parasite cell division by over 50 percent. The authors focused on an antihistamine called astemizole, which can be taken orally. Testing in mice showed that it knocked down the load of the malarial parasite by about 80 percent. The drug itself is no longer on the market in most of the developed world, since alternatives with fewer side effects are available. As the patent on the compound has expired, however, it is being sold in over-the-counter form in the developing world. The authors also note that, as a successful drug, hundreds of variants of it have already been synthesized, and these may be worth testing for enhanced anti-malarial activity.
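The "hit" selection step in a screen like this boils down to a simple threshold filter over the assay results. A rough sketch, with compound names and inhibition percentages invented for illustration (only astemizole's approximate figure comes from the article):

```python
def find_hits(inhibition, threshold=50.0):
    """Return compounds whose measured inhibition exceeds the threshold,
    sorted from most to least effective."""
    hits = [name for name, pct in inhibition.items() if pct > threshold]
    return sorted(hits, key=lambda name: inhibition[name], reverse=True)

# Mock assay results: percent reduction in parasite cell division.
assay = {
    "astemizole": 80.0,   # strong hit (figure roughly matches the article)
    "compound_a": 62.5,   # invented moderate hit
    "compound_b": 12.0,   # invented miss: below threshold, discarded
}
```

The real screen of course involves automated liquid handling and replicate measurements, but conceptually the evolutionary approach is exactly this: test everything, keep whatever clears the bar, then synthesize and test variants of the survivors.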
The work was done at Johns Hopkins, and there's an interesting coda in the final paragraph that shows that this promising work may not be the last high throughput screen to use this panel of drugs:
Currently, the JHCCL is undergoing expansion to include every available drug ever used in the clinic via phase 2 clinical trials or approval by the FDA or its foreign counterparts. When complete, the JHCCL will be available to any researcher interested in screening for existing drugs that may be useful as economically viable new therapies for diseases of the developing world.
Hopefully, this won't be the last time we report on screens using this panel of drugs.
Nothing says “we’re not a faceless corporation” like a corporate blog—unless that blog is launched by Dell and features product announcements and tours of the “Enterprise Command Center.”
After opening its new blog to the public last week (Name: one2one, Tag: “Direct conversations with Dell”), it didn’t take long for Dell to come in for a blogosphere tongue-lashing. Jeff Jarvis complained that “Dell isn’t listening. And listening, once more, is the first step in blogging.” Steve Rubel made the same critique, telling Dell to “Join us. Be real. Walk the talk.”
It didn’t take long for the complaints to get Dell’s attention. One2one’s newest post, put up only this morning, is headed “Real People are Here and We’re Listening.” To prove it, Lionel Menchaca, Digital Media Manager at the company, went on to provide links to the blog’s critics and said that Dell really, truly, actually wants to join the conversation. “We’re excited to be here,” Menchaca said, “and we welcome your ideas.”
Dell’s week-old experiment in corporate blogging illustrates the difficulties faced by companies who make the decision to engage in a public discussion of their products and their problems. There’s obviously a fine line to walk here between being open to talk about corporate weaknesses and driving away potential business, but one2one shows that the blogosphere has no time for corporations who simply want to use a blog as another PR outlet. To its credit, Dell seems to want more than this for the new site.
The question is whether a corporate blog can ever be more than a marketing site. At some level, such blogs unavoidably become marketing tools—but that’s not necessarily a bad thing. If a company tries to make itself look better by listening to and interacting with customers, that’s the kind of marketing and PR push that we in the Orbiting HQ would like to see more often.
It usually works best when not pitched as an “official corporate blog,” and Microsoft has done a decent job of this with their MSDN blogs, which actively solicit developer feedback. Well-crafted blogs can humanize an organization, but they can also provide valuable, direct feedback from customers to developers and engineers. If done right, corporate blogs can help both the customers and the company. When treated as a traditional PR vehicle, nobody wins.
It was almost two years ago to the day that we reported on Internet Explorer’s first-ever drop in browser market share. At the time, IE usage had dropped from 94.8 percent at the beginning of 2004 to 93.9 percent a few months later. As Firefox approached the big 1.0 milestone, its market share continued to soar, and it passed the 10 percent barrier in October 2005.
Web analytics firm OneStat.com is now reporting that Firefox has grabbed an almost 13 percent market share worldwide, while IE has dropped to just over 83 percent. Firefox’s current 12.93 percent market share is up from 11.51 percent in November 2005, while Internet Explorer is down almost 2.5 percentage points. In the US, IE has dipped below the 80 percent mark, down to 79.78 percent, while Firefox has 15.82 percent of the market.
OneStat.com measures browser usage by looking at the traffic at its clients’ web sites. The figures from sites using the company’s commercial traffic analysis package are combined to come up with numbers that represent the average number of visits from a particular browser. According to TheCounter.com, while IE 5 and 6 combine for 84 percent market share, Firefox has just 10 percent and Safari 2 percent. That’s a significant gain for Firefox, which had just 6 percent at the beginning of the year by TheCounter.com’s stats.
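The aggregation both firms describe, counting visits per browser and normalizing to percentages, is simple to sketch. The function and sample data below are illustrative only, not OneStat.com's or TheCounter.com's actual methodology:

```python
from collections import Counter

def browser_shares(visits):
    """Given one browser name per recorded visit, return each browser's
    share of total visits as a percentage rounded to two decimal places."""
    counts = Counter(visits)
    total = sum(counts.values())
    return {browser: round(100.0 * n / total, 2)
            for browser, n in counts.items()}
```

In practice the browser name is parsed from each visit's User-Agent header, which is also why such figures vary between firms: different client sites attract different audiences, and different parsers classify ambiguous user agents differently.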
Here at Ars, the picture is a bit different. A quick glance at our stats shows that Firefox is the most popular browser with our readers, with 41.92 percent. Internet Explorer accounts for 29.1 percent with Safari at 9.9 percent and Opera at 2.45 percent.
Firefox is even stronger in other parts of the world. In particular, the browser accounts for 39.02 percent of all web traffic tracked by OneStat.com in Germany, with IE sitting at 55.99 percent. Firefox has also broken the 20 percent barrier in Italy and Australia.
With new browsers in the offing from both Microsoft and the Mozilla Foundation, those numbers may be shaken up a bit once Internet Explorer 7 and Firefox 2.0 ship later this year (beta candidate 1 of Firefox 2.0 was released yesterday). Of course, Internet Explorer 7 is a more radical change from its predecessor, while Firefox 2.0 is a more evolutionary revision. Whether IE 7’s new features and improved security settings will be enough to stop the defections to alternatives remains to be seen.
Movie studios are showing themselves increasingly willing to put their films up for sale and rental on the Internet—and not just through sites that they own or control. The most recent example comes from Sony Pictures Home Entertainment, which has just inked a deal with online video distributor GUBA. GUBA becomes the first “video sharing community” to get access to the Sony catalog of films, but don’t think that “sharing” means “free.”
GUBA plans to charge 20 bucks to download new features and a ten spot for films from the back catalog. Though the service initially has only 100 Sony films, this will be expanded to 500 within the next year. The films are protected by Microsoft DRM (sorry, Mac users), and they’re only viewable on a Windows computer (or an HTPC hooked up to a television). As is usual with this type of setup, no DVD burns will be allowed.
In some ways, GUBA is an odd choice for a Sony partner. Much of the site is a YouTube-style assortment of zany videos, which means that you can have a link to a man who can touch his eye with his tongue on the same page as the link to Underworld: Evolution. Such pairings can make the site feel a bit schizophrenic, but GUBA has done a good job of making it simple to look for either free or premium content.
GUBA has made quite a name for themselves the last few months. In addition to scoring the recent Sony deal, the site also announced a partnership with Time Warner in June. Warner, like Sony, has shown a willingness to experiment when it comes to Internet distribution, though they’ve been doing it longer than Sony has.
Warner already has a deal in place with one-time pariah BitTorrent. The plan to offer DRMed movies to users through BitTorrent’s efficient distribution system is a telling admission of the legal uses of peer-to-peer technology, though studio insistence upon strict DRM controls and a lack of DVD-burning options make the service no more attractive than GUBA.
Warner has also been active in Europe, partnering with another peer-to-peer company there to offer movie downloads. Such moves are excellent news for consumers, but not for the reason you might expect. What’s exciting about the recent announcements is that they show the movie studios have learned their lesson from the music business and are determined to provide good legal alternatives to piracy right from the start.
Unfortunately, the actual services that have been rolled out are underwhelming unless you own an HTPC. Even then, they aren’t a great deal when you consider that picking up the DVD costs about the same price and offers more flexibility and portability. When movie studios finally discover the magic combination of price and DRM that makes their product compelling to consumers, online distribution could become a lucrative alternative to traditional retail. That day has not yet arrived.