According to documents filed by Creative with the United States International Trade Commission in mid-May, Steve Jobs approached a Creative employee at Macworld 2001 to talk shop about Creative’s then-flagship product, the Nomad Jukebox. What began at this innocuous meeting may well have led to the creation of a digital audio empire that could have rivaled what Apple has put together on its own. On the other hand, it may have also led to stunning failure for both parties.
In the past few months, one of Apple’s biggest rivals in the digital audio player space, Creative, has begun competing with Apple using legal tactics. Looking back before the birth of the iPod, Creative had a significant lead on Apple, with almost a full year’s experience in developing and selling hard-disk-based players. What is almost completely unknown, however, is that Apple was willing to work with Creative to develop an Apple-licensed player as opposed to building their own from scratch.
According to Creative’s legal filings (available via PACER) with the US International Trade Commission, Apple had originally sought to license Creative’s IP and create the iPod on Creative’s platform. The filings reveal that Apple was unsure that this scheme was going to be profitable in the long run, and proposed the radical (for Apple) idea of partnering with Creative to create a digital audio player. Creative decided against joining forces, and the rest is history.
While one might be quick to assume that a merger between the two companies would have led to a marriage punctuated with hundred dollar bills falling from the mirrored ceiling of their honeymoon suite, it may also be that a mating of these two wildly different corporations would have been destined for failure. Apple’s reckless abandon coupled with Creative’s bulky hardware and stodgy corporate culture would likely have resulted in clashes of personality, among other things.
As it stands, Apple scored big when Creative turned down their proposal, freeing them up to engineer the iPod as they had envisioned it, rather than working off of one of Creative’s prototyped designs—or using Creative’s patented IP, as the lawsuits allege.
Relations between Creative and Apple were dotted with pithy remarks and below-the-belt jabs until this lawsuit business reared its head. Apple was able to take a radically different approach to marketing, growing its market while making a lot of money. As a result, the iPod has become synonymous with the portable digital audio player market. Today, Creative and Apple stand with patent lawsuits loaded, waiting for the other to flinch first—unless they can come to a settlement themselves.
VoIP service provider Vonage has acquired three patents from IP license aggregator Digital Packet Licensing LLC. The patents (4,782,485, 5,018,136, 5,444,707) deal with various aspects of data compression for VoIP traffic, and are already the basis for lawsuits against Sprint, Verizon, AT&T, and Nortel, among others. In the press release about this acquisition, Vonage president Michael Tribolet says that “The acquisition of these patents is part of Vonage’s strategic plan to further develop our core technology and build on our innovative, affordable and feature-rich phone service. Vonage’s strategic plan also includes a concerted effort to grow, create and acquire other significant intellectual property portfolios.”
So far, this sounds an awful lot like patent trolling, doesn’t it? Buy a portfolio of patents with existing licensing deals and a couple of infringement lawsuits already in motion, then stand back and reap the rewards. But the situation isn’t quite that simple.
In other news yesterday, Vonage was itself slapped with a fresh patent infringement lawsuit, this time by a tiny tech outfit by the name of Klausner Technologies. Fresh off a successful lawsuit against AOL, based on the same digital voicemail patent (5,572,576), Klausner is asking Vonage for US$180 million in damages and royalties. Vonage is also already the target of a lawsuit from Verizon, where the telco claims infringement of seven different patents.
Where there’s smoke, there’s fire; maybe the Digital Packet patents were bought as insurance and defense against the incoming stream of suits, rather than aggressive tools for seeking easy revenues from lawsuit settlements or judgements. You be the judge.
I’ve been a fan and user of Vonage’s phone service for years, and in my opinion there’s nothing wrong with the service itself. The business side of Vonage is a different story. The company is hemorrhaging cash and may never be profitable. That’s not conjecture; it’s clearly stated in the company’s IPO prospectus. Vonage looks like a train wreck in progress, and these patent issues may speed or slow that progress, but they won’t stop it either way. I’m afraid I’ll have to change my phone provider soon, maybe in the next two or three years, not because I’m unhappy with my current one, but because I think it may go out of business.
Science, and the pursuit of science, is often held up as a paragon of integrity. Noble researchers seek out empirical truths regarding the world around us, reporting their results and sometimes challenging dogma. Therefore, when reports or accusations of scientific misconduct or fraud occur, they sting all the more. Everyone expects a politician to be dishonest, but the men in white lab coats are held to a higher standard.
A notable recent example concerns the work on human embryonic stem cells conducted in South Korea under Dr Woo Suk Hwang. Readers may remember the excitement that surrounded the announcement, and then the furore following news that results were faked or enhanced, and that the postdocs working in the lab were pressured into donating their own eggs to the project.
Lest you think it only happens in South Korea, the past couple of years have seen other examples of scientific misconduct, and as a result, journals are beginning to take the issue even more seriously. This brings me to an article in last week's Science. As I've covered before, postdocs make up the backbone of most research labs in the US. Postdocs depend on good publications in the fight for the few available faculty appointments, and they are often at the mercy of their PIs because postdoctoral positions fall somewhere between jobs and studentships when it comes to employment law. More and more postdocs in the US are reporting that they too are subject to the same kinds of pressures experienced by those working under Dr Hwang. Worse yet, those postdocs that try to speak out about the problem find that being the lowest rung on the ladder means that if it's a choice between them and the high-flying tenured PI, there's no contest.
Stories abound about labs where studies start with preconceived endpoints, where data that 'doesn't fit' is left out, or worst of all, data that is knowingly false is published anyway. It is almost understandable. Funding is becoming ever harder to obtain, and there is constant pressure from University administrators to bring in more grant money and publish in prestigious journals. Almost, but not quite. If studies are approached with a fixed conclusion at the outset, how does that differ from the proponents of ID? And the more such scandals occur, the closer the public comes to ranking the men in white lab coats alongside baby-kissing politicians.
Please don't get the impression that labs such as these are the norm; data-fudging pressure cookers are the exception to the rule, but when they coincide with high-profile labs, it has more of an impact. In the decade that I've been working in research, I may have heard stories in the pub regarding other labs, but can't think of a time when anyone has told me to pretend a result never happened, nor have I seen it happen to anyone else.
We've reported on Steam keeping track of statistics about how people are playing Half-Life 2 Episode 1, but this one is a thinker. Valve has released an update to the game, and it seems as if the biggest point was to make the game easier during the elevator scene. Remember, the game already has multiple levels of difficulty, but based on their numbers, this section was too hard for many players. Enough to change the game for everyone?
I loved the elevator portion of the game—it got my blood pumping. While I died more than once, when I was finally able to beat that section I got a nice sense of accomplishment. There's nothing worse than being stuck in one area in a game, but of course there's nothing better than finally getting past it and continuing on. You can't have one without the other. While I like the fact that Valve is trying to keep their customers happy, couldn't they have added this only to the lower difficulty levels and allowed those of us who like a challenge to enjoy the game as it was originally designed?
It's also worth noting that this update is applied when you start Steam up, whether you want it or not. Outside of unhooking your internet connection whenever you start Steam, there's nothing you can do to keep this out of your system. The idea of near-mandatory patches makes me more than a little nervous, and it's a side of digital distribution we haven't really talked about. I know a lot of people who play a lot of games that they keep at a very specific version number, simply because they dislike what certain patches do to the game. If we lose that ability and a bum patch or update is released, you're going to have some grumpy people playing your game. Especially if you like a higher level of difficulty.
To an extent, there are two schools of thought about how to go about screening for a new drug. One could be called the "Intelligent Design" method: choose a target, study its structure and biochemistry, and then design a drug that specifically interferes with it. In this case, much of the work goes into understanding the system. The second, called "High Throughput Screening," takes a more evolutionary approach. All you need in this case is a test tube version of the process you want to affect. You search for drugs simply by dumping every chemical you can get your hands on into the test tube and looking for those that inhibit the process. Once an effective compound is identified, variations of it are synthesized and tested in turn.
Although less targeted, high throughput screening has some advantages. You don't need a complete understanding of the process you're trying to alter in order to test compounds, meaning it will work with complex and poorly understood disease targets. Malaria ranks pretty high up on the scale of complex and poorly understood diseases, and a reader was kind enough to point me to an article that describes a screen of compounds for the ability to inhibit the growth of drug resistant strains of the malarial parasite. The technique also took advantage of a different aspect of high throughput screening: the majority of the drugs they tested have already been approved by the FDA, meaning they've passed rigorous safety screening, and would be relatively inexpensive to bring to market for a new use.
Nearly 200 drugs were identified that reduced parasite cell division by over 50 percent. The authors focused in on an antihistamine called astemizole, which can be ingested orally. Testing in mice showed that it knocked down the load of the malarial parasite by about 80 percent. The drug itself is no longer on the market in most of the developed world, since alternatives with fewer side effects are available. As the patent on the compound has expired, however, it is being sold in over-the-counter form in the developing world. The authors also note that, as a successful drug, hundreds of variants on it have already been synthesized, and may be worth testing for enhanced anti-malarial activity.
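To make the hit-selection arithmetic concrete, here is a toy sketch (not the authors' actual pipeline) of how a screen like this flags candidates: a compound counts as a hit when its measured parasite growth, relative to an untreated control, falls below 50 percent. All compound growth values besides the astemizole entry's name are invented for illustration.

```python
# Toy illustration of hit selection in a high-throughput drug screen.
# Growth values are fractions of untreated-control parasite division
# (1.0 = no effect); all numbers here are invented.

def select_hits(growth_by_compound, max_growth=0.5):
    """Return compounds that reduce growth by more than 50 percent,
    i.e. whose relative growth falls below max_growth."""
    return sorted(
        name for name, growth in growth_by_compound.items()
        if growth < max_growth
    )

screen = {
    "astemizole": 0.18,   # strong inhibition (hypothetical value)
    "compound_a": 0.92,   # essentially no effect
    "compound_b": 0.47,   # just clears the 50 percent cutoff
    "compound_c": 0.55,   # narrowly misses the cutoff
}

print(select_hits(screen))  # → ['astemizole', 'compound_b']
```

The real screen then follows up on hits with animal testing, as the mouse experiments with astemizole show.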
The work was done at Johns Hopkins, and there's an interesting coda in the final paragraph that shows that this promising work may not be the last high throughput screen to use this panel of drugs:
Currently, the JHCCL is undergoing expansion to include every available drug ever used in the clinic via phase 2 clinical trials or approval by the FDA or its foreign counterparts. When complete, the JHCCL will be available to any researcher interested in screening for existing drugs that may be useful as economically viable new therapies for diseases of the developing world.
Hopefully, this won't be the last time we report on screens using this panel of drugs.
Today we looked at the man behind the first joystick, and from that launching point it's a good time to talk about everything that came thereafter. While most of us are familiar with gamepads starting with the Atari or NES and going from there, it's hard to wrap your mind around all the gaming controllers that have been released since. From the gimmicky to the surreal, if you can dream it, someone has done it. Some of the more popular modern games are played with dance mats or guitar controllers, but who really wants to control a game with a big finger that you stab into an arcade cabinet with a model of someone's behind? I'm convinced that game was only invented so we have an easy joke when we talk about game controllers.
1up.com has a look at some of the more "out there" controllers. Some I've used, some I've avoided, and others I've never heard of. A lot of these things just didn't work; ideas always move faster than technology, but unfortunately that rarely keeps companies from selling products anyway. I'm not going to be quite as hard on these guys as 1up was, I think it's important to note that it takes a lot of failures before we get something like a Guitar Hero controller or a DDR Pad. Of course, the Guitar Hero controller is a ripoff of the Guitar Freaks controller, and the DDR Pad is really just a Power Pad with different colors, so maybe there are just a few ideas out there and we keep repeating them.
I do think it's odd that they completely ignore rhythm games, since that's a genre that's always begging for custom controllers. From the Donkey Konga Bongos to the Beatmania controller to the aforementioned Dance Dance Revolution pads, most rhythm games have their own controller. Luckily most of these games are solid and avoid the gimmickry associated with the sillier controllers, so they're not as fun to read about. This article really makes me want to track down some Samba De Amigo controllers.
Oracle CEO Larry Ellison has stated that the database company could soon be providing Linux support services to Red Hat customers. During an interview several months ago, Ellison announced plans to enter the Linux distribution business, potentially by acquiring Novell. Citing distribution compatibility concerns, Ellison claimed that Oracle would be better off with its own complete middleware stack rather than trying to support Oracle database and middleware software on a rapidly growing number of Linux distributions. Now it appears Ellison is leaning towards simply appropriating what Red Hat produces and building a support services business on top of it rather than buying up a distributor:
“Red Hat is too small and does not do a very good job of supporting [its customers]. … The great thing about open source, the most interesting thing to me is the intellectual property. … We can just take Red Hat’s intellectual property and make it ours, they just don’t have it.”
Ellison’s attitude isn’t all that surprising, since he made similar statements about JBoss following Red Hat’s acquisition of the Java middleware company in April:
“Why didn’t we buy JBoss? Because we don’t have to – if it ever got good enough we’d just take the intellectual property – just like Apache – embed it in our fusion middleware suite, and we’re done.”
Although Oracle’s approach may seem exploitative, it is important to keep in mind that the absence of ownership in open source software is intentional. In the open source software industry, companies contribute to a shared intellectual property commons and compete with each other on the basis of service and support quality.
Red Hat and Oracle previously enjoyed a strong, mutually beneficial relationship, but in the wake of the JBoss acquisition, Red Hat is now in the middleware market, where it competes directly with Oracle. By offering support services to Red Hat customers, Ellison hopes to limit Red Hat’s growth into Oracle’s territory and make some money at the same time.
Although existing Oracle customers will probably be interested in streamlining their support consumption and working with one vendor rather than two, it is doubtful that Oracle will be able to meet the needs of Red Hat users better than Red Hat. Red Hat certainly needs to be more responsive to certain kinds of support issues, but that doesn’t imply that a company with more resources and expertise will be more successful attempting to supply the same services externally. Ellison seems to think that by leveraging its superior resources, Oracle can beat Red Hat at its own game. I think that Ellison suffers from some misconceptions about the nature of the open source software development process, and fails to recognize that such a business model would make Oracle dependent on Red Hat in many respects. In order to support Red Hat customers, Oracle would have to work closely with Red Hat, and Oracle’s aggressive attitude really doesn’t give Red Hat any incentive to be accommodating. It seems to me that if Oracle decides to make Linux support a serious part of its business, it will have to create its own distribution or a derivative.
It was almost two years ago to the day that we reported on Internet Explorer’s first-ever drop in browser market share. At the time, IE usage had dropped from 94.8 percent at the beginning of 2004 to 93.9 percent a few months later. As Firefox approached the big 1.0 milestone, its market share continued to soar, and it passed the 10 percent barrier in October 2005.
Web analytics firm OneStat.com is now reporting that Firefox has grabbed an almost 13 percent market share worldwide, while IE has dropped to just over 83 percent. Firefox’s current 12.93 percent market share is up from 11.51 percent in November 2005, while Internet Explorer is down almost 2.5 percentage points. In the US, IE has dipped below the 80 percent mark, down to 79.78 percent, while Firefox has 15.82 percent of the market.
OneStat.com measures browser usage by looking at the traffic at its clients’ web sites. The figures from sites using the company’s commercial traffic analysis package are combined to come up with numbers that represent the average number of visits from a particular browser. According to TheCounter.com, while IE 5 and 6 combine for 84 percent market share, Firefox has just 10 percent and Safari 2 percent. That’s a significant gain for Firefox, which had just 6 percent at the beginning of the year by TheCounter.com’s stats.
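The aggregation OneStat.com describes is straightforward to sketch: pool per-browser visit counts across client sites, then express each browser's total as a percentage of all visits. This is a minimal illustration with invented numbers, not OneStat's actual methodology or data.

```python
from collections import Counter

def market_share(visits_by_browser):
    """Convert raw visit counts into percentage shares, rounded to
    two decimals the way browser-share reports present them."""
    total = sum(visits_by_browser.values())
    return {
        browser: round(100 * count / total, 2)
        for browser, count in visits_by_browser.items()
    }

# Invented visit counts pooled across two hypothetical client sites.
visits = Counter()
for site_log in (
    {"IE": 830, "Firefox": 130, "Other": 40},
    {"IE": 790, "Firefox": 160, "Other": 50},
):
    visits.update(site_log)

print(market_share(visits))  # → {'IE': 81.0, 'Firefox': 14.5, 'Other': 4.5}
```

Because the sample is only the firm's own client sites, figures from different trackers (OneStat, TheCounter.com, Ars' own logs) can diverge widely, as the numbers above and below show.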
Here at Ars, the picture is a bit different. A quick glance at our stats shows that Firefox is the most popular browser with our readers, with 41.92 percent. Internet Explorer accounts for 29.1 percent with Safari at 9.9 percent and Opera at 2.45 percent.
Firefox is even stronger in other parts of the world. In particular, the browser accounts for 39.02 percent of all web traffic tracked by OneStat.com in Germany, with IE sitting at 55.99 percent. Firefox has also broken the 20 percent barrier in Italy and Australia.
With new browsers in the offing from both Microsoft and the Mozilla Foundation, those numbers may be shaken up a bit once Internet Explorer 7 and Firefox 2.0 ship later this year (beta candidate 1 of Firefox 2.0 was released yesterday). Of course, Internet Explorer 7 is a more radical change from its predecessor, while Firefox 2.0 is a more evolutionary revision. Whether IE 7’s new features and improved security settings will be enough to stop the defections to alternatives remains to be seen.
Nothing says “we’re not a faceless corporation” like a corporate blog—unless that blog is launched by Dell and features product announcements and tours of the “Enterprise Command Center.”
After opening its new blog to the public last week (Name: one2one, Tag: “Direct conversations with Dell”), it didn’t take long for Dell to come in for a blogosphere tongue-lashing. Jeff Jarvis complained that “Dell isn’t listening. And listening, once more, is the first step in blogging.” Steve Rubel made the same critique, telling Dell to “Join us. Be real. Walk the talk.”
It didn’t take long for the complaints to get Dell’s attention. One2one’s newest post, put up only this morning, is headed “Real People are Here and We’re Listening.” To prove it, Lionel Menchaca, Digital Media Manager at the company, went on to provide links to the blog’s critics and said that Dell really, truly, actually wants to join the conversation. “We’re excited to be here,” Menchaca said, “and we welcome your ideas.”
Dell’s week-old experiment in corporate blogging illustrates the difficulties faced by companies that decide to engage in a public discussion of their products and their problems. There’s obviously a fine line to walk between being open about corporate weaknesses and driving away potential business, but one2one shows that the blogosphere has no time for corporations that simply want to use a blog as another PR outlet. To its credit, Dell seems to want more than this for the new site.
The question is whether a corporate blog can ever be more than a marketing site. At some level, such blogs unavoidably become marketing tools—but that’s not necessarily a bad thing. If a company tries to make itself look better by listening to and interacting with customers, that’s the kind of marketing and PR push that we in the Orbiting HQ would like to see more often.
It usually works best when not pitched as an “official corporate blog,” and Microsoft has done a decent job of this with their MSDN blogs, which actively solicit developer feedback. Well-crafted blogs can humanize an organization, but they can also provide valuable, direct feedback from customers to developers and engineers. If done right, corporate blogs can help both the customers and the company. When treated as a traditional PR vehicle, nobody wins.
If you’re a betting man, Microsoft’s Chief Software Architect has some guidance for you on Windows Vista’s as-yet-undetermined release date. Speaking in Cape Town, South Africa, Bill Gates said that there is an 80 percent chance that Windows Vista will be ready for release in January of 2007. Or if you’re a pessimist, you could call it a 20 percent chance that Vista won’t be ready.
After a series of slips that ultimately pushed the OS years off course, the light finally appeared at the end of the tunnel for the Redmond giant when plans for a late 2006 release started to gel. Yet in March the company announced that it would delay the OS until 2007, taking a tone of penance and promising that this time, they wouldn’t rush things. Jim Allchin, co-president of Microsoft’s Platforms & Services Division, said at the time, "We are trying to do the responsible thing here… Maybe in the past we would have just gone ahead but now we’re not going to do that."
Comments from Gates in South Africa struck a similar note, with Gates promising that the company will look seriously at what they learn from the future Beta 2 release. The Wall Street Journal quoted Gates as saying, "We got to get this absolutely right. If the feedback from the beta tests shows it is not ready for prime time, I’d be glad to delay it."
Gates isn’t the only Big Boss at Microsoft sounding the warning alarms. In May, CEO Steve Ballmer emphasized that timing issues with the company’s partners could contribute to a delay. Gates is now suggesting that technical matters could also play a role, which is a bit like pointing out that the sun is bright: serious technical challenges can always introduce last-minute delays. However, the public nature of Gates’ comments suggests that Microsoft is indeed bracing for the possibility of yet another delay.
At this stage, what remains clear is that we are not likely to hear a formal "shipping" date for several more months, as the beta program drags out. Another beta is expected in the coming months, and that beta will be followed by at least one if not two or three release candidates.
Gates also told attendees at the conference in Cape Town that the company was investing between US$8 billion and $9 billion on Windows Vista and Office 2007, the two biggest potential revenue generators for the company in the coming years.