Jan 01, 2019

We've talked before on the subject of a game's UI and how easy it is to get around the menus. If this is done poorly, it can make the entire game feel low-rent and slapped together. The more features you put into a game, like multiplayer and customizable controls, the harder it is to keep the UI elegant and easy to use. Bungie has been spending some serious time making sure the UI for Halo 3 is as good as it can be, and it seems to be paying off.

As it stands now, the interface is designed to make it very simple to get to a "lobby" and make all necessary changes from there, so that you can customize your experience, your appearance, and, more importantly, really specific game types from anywhere.

Folks who've played a lot of custom games know that it is very frustrating when the "Quick Options" selection just isn't enough to get all the options you want in a game type. You won't have to worry about that from now on. You can also change your controller settings from anywhere, at any time. That one was a given, right?

Nice. Being able to set up new custom matches was a large part of what made Halo 2 multiplayer so much fun. Add to that the fact that any time spent in menus is time you could be spending in the game, and you have two compelling reasons to make sure that UI hums. It's going to be great to be able to quickly change up the options in the game between matches, and to have full control of the game whenever you want to make major changes.

It's good to see a developer paying a lot of attention to how the UI works, and it's very cool that they're posting and getting excited about it on their blog. This is one of those things that, when it's done well, is taken for granted. When a UI is bad, we all complain. It's a thankless job, but that doesn't make it any less important.

Jan 01, 2019

The EU case against Microsoft is getting harsher by the day. The latest word is that the company will indeed be fined for not supplying requested information in a timely manner, with fines backdated to December 15, 2005, at €2 million a day. That’s the current ceiling on the fines, set in the original ruling, but some sources also say that this cap may be raised to €2.5 million or even €3 million a day. That comes out to a maximum of about €600 million in total, if Microsoft settles the matter today. Further delay would, of course, also increase the financial hurtin’.

Microsoft continues to claim that the fines are “unjust” because the European Commission wasn’t clear about exactly what information it wanted. EU spokespeople counter that argument by pointing to the two years Microsoft has been given to get its act together. The company claims to be working furiously at collecting the material, and that a final release is due any day now, but some people think that it’s all hot air. “We’ve heard that before and that wasn’t true,” says an unnamed Microsoft competitor. “Why should we believe them now?”

In that light, it looks like the EU is slapping Microsoft a good deal harder than anyone expected, least of all Microsoft. €3 million a day isn’t chump change even for one of the biggest companies in the world, as it translates into about US$1.4 billion a year at today’s exchange rates. That’s 3.3 percent of the company’s total sales over the last twelve months, or 10.4 percent of its profits. Shareholders would not be happy with fines that large, and that even ignores the princely sum of €497 million Microsoft already paid out to settle the judgment.
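Those figures are easy to sanity-check with a bit of arithmetic. Here is a back-of-the-envelope sketch in Python; the late-June 2006 settlement date and the roughly 1.27 USD/EUR rate are my assumptions, not numbers from the ruling:

    from datetime import date

    # Rough sanity check of the figures above. The "settled today" date
    # and the ~1.27 USD/EUR rate are assumptions for illustration.
    USD_PER_EUR = 1.27
    backdated_from = date(2005, 12, 15)
    settled_on = date(2006, 6, 30)            # assumed "today"
    days = (settled_on - backdated_from).days  # about 197 days

    for daily_fine_eur in (2_000_000, 3_000_000):
        total = days * daily_fine_eur
        print(f"€{daily_fine_eur / 1e6:.0f}M/day -> €{total / 1e6:.0f}M if settled today")

    # Annualized, the proposed €3M/day ceiling is what makes shareholders wince:
    annual_usd = 3_000_000 * 365 * USD_PER_EUR
    print(f"~US${annual_usd / 1e9:.1f} billion per year")

At the proposed €3 million cap, the backdated total lands right around the €600 million figure the sources are quoting, and the annualized number matches the US$1.4 billion above.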

I’d say that Microsoft now has a powerful load of incentives to comply with the information request, above and beyond those attached to the original order. Surely these top-secret documents, if made public, couldn’t possibly cost the company more than what it costs to pay these fines indefinitely. If the problem truly lies in difficulties gathering the required info, it’s also a great incentive for improved documentation standards.

Jan 01, 2019

If you like your major motion pictures without cusswords and nudity, you may be out of luck. A federal judge in Colorado has ruled that it is illegal for third parties to edit and release sanitized versions of movies.

Some background: a couple of years ago, we reported on a company called CleanFlicks, which had drawn the ire of the motion picture industry. Based in Utah, CleanFlicks uses a homegrown system to check DVDs for questionable material and cleanse it from the movie. One of the results was a version of Good Will Hunting with 125 f-bombs and other colorful expletives muted.

When the technology was shown to a group of directors and other Hollywood types, they were furious. They felt that CleanFlicks and other companies offering the same sort of service were, at best, undermining the artistic integrity of their films. The sanitizers argued that there was a strong demand for their services and that their edits were covered under Fair Use. Soon enough, a handful of companies offering sanitized movies appeared. Predictably, lawsuits ensued.

In his ruling, US District Court Judge Richard Matsch sided with the Directors Guild of America. He has handed down a permanent injunction prohibiting CleanFlicks, CleanFilms, Family Flix USA, and others from selling or renting the "cleaned-up" movies. In addition, the companies have five days to hand over every sanitized disc and videotape in their inventories to the studios for immediate destruction.

According to the judge, it is up to the moviemakers to decide whom their films are appropriate for and to tailor them accordingly. "Whether these films should be edited in a manner that would make them acceptable to more of the public playing on a DVD in a home environment is more than merely a matter of marketing; it is a question of what audience the copyright owner wants to reach," he wrote in his ruling. "What is protected are the creator’s rights to protect its creation in the form in which it was created."

An appeal is possible, although one of the defendants, Family Flix USA, has already shut down operations. In the meantime, those who find cursing and nudity in movies unacceptable are advised to simply not watch films that contain content they find objectionable.

Jan 01, 2019

Last week, Microsoft announced that it would be developing its own version of an ODF plug-in for Office 2007. The company claimed that it would be releasing the Open XML Translator via SourceForge.net, a popular site for hosting open source applications. Since the Open XML Translator's source code is freely available, it's no wonder that people would not only be perusing it but also comparing it to the OpenDocument Foundation's ODF Plugin code. Pamela Jones did just that, and she claims that Microsoft swiped some of its source from J. David Eisenberg's program, which converts ODF to HTML and is licensed under both the LGPL and Apache 2.0.

Groklaw has a comparison of the pieces of code (XSL) in question, and while they do look very, very similar, they also have some minor differences. Still, Jones questions whether Microsoft is actually allowed to copy the code, and goes on a protracted, sarcasm-laden rant about what Eisenberg should do to Microsoft for being a copycat:

Naturally, David will be filing a major lawsuit asking for billions and gazillions in damages. He'll probably wait about 5 or 6 years though, until lots of folks are using the Microsoft plugin, and in the interim, he'll donate some code to the Microsoft project and distribute it himself, and *then* he'll announce he's shocked, shocked to discover his code inside the Microsoft plugin, call a press conference and let the media know he will be suing Microsoft. They have deep pockets, after all, and he has to consider his shareholders. A man can't just sit around the camp fire singing Kumbaya when there's money to be made.

And of course he'll also have to sue corporate end users and petition the courts to shut down their businesses under the DMCA, and he'll issue sanctimonious press releases about his stolen Most Holy IP and how Microsoft is made up of a bunch of lunatic fringe criminals who don't respect other people's intellectual property. Maybe he can grab some headlines by sending a letter to Congress.
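For anyone who wants to eyeball the resemblance themselves, a comparison like Groklaw's can be run mechanically. Here's a minimal sketch using Python's difflib; the two fragments below are invented stand-ins, since the actual XSL is reproduced in Groklaw's post:

    import difflib

    # Two invented XSL-ish fragments standing in for the real snippets,
    # which are quoted side by side in the Groklaw article.
    odf_to_html = """<xsl:when test="name(current()) = 'text:s'">
        <xsl:call-template name="insert-spaces"/>
    </xsl:when>"""

    translator = """<xsl:when test="name(current())='text:s'">
        <xsl:call-template name="insert-spaces"/>
    </xsl:when>"""

    # A ratio near 1.0 means "very, very similar"; the unified diff
    # shows the minor differences.
    ratio = difflib.SequenceMatcher(None, odf_to_html, translator).ratio()
    print(f"similarity: {ratio:.2f}")
    for line in difflib.unified_diff(odf_to_html.splitlines(),
                                     translator.splitlines(), lineterm=""):
        print(line)

A ratio in the high nineties with a one-character diff is roughly the picture Groklaw paints.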

Jones loathes Microsoft, but her mockery is mostly directed at SCO. Nevertheless, her point pertains to Microsoft as well and is worth talking about. Over the years, Microsoft has gone out of its way to bury open source projects, even if it is now starting to dabble in Free and Open Source Software (FOSS). How could the company all of a sudden not only acknowledge open source software's popularity, but take a little code from it as well? It's easy—the company's in-house devs didn't write the Open XML Translator. A Microsoft partner did instead.

The Open XML Translator was developed by the French company Clever Age. My guess is that, since the company creates open source projects on a regular basis, one of the developers didn't think anything of it when he borrowed the code. Credit was in fact given to Eisenberg in a comment that reads, "Extra spaces management from J. David Heisenberg." The programmer did misspell his name, though, and that's kind of a slap in the face to Eisenberg. "Please fix the typo on my surname; I'm certain there's no H in it (obligatory Heisenberg joke)," he said.

So is what Microsoft, or more precisely Clever Age, has done really such a bad thing? Should Microsoft step in and change the code? Is this an embarrassment or an achievement for the king of proprietary code? Eisenberg told Jones that he'd be willing to license the code to Microsoft under the BSD license, and Microsoft will hopefully accept the offer once it's on the table.

Update: I missed the SCO connection completely. Everyone who said she was talking about SCO in her rant was right, and I've corrected the article.

Jan 01, 2019

The Tao of Mac Weblog has put into words what I'm guessing sums up the feelings of many .Mac users. Rui's main point is that US$99 is way too much to pay for services that haven't really been updated to meet modern expectations. Google's Gmail is mostly to blame, but many of .Mac's services and their pricing also seem vastly out of line with what you can get from third-party products. Applications like JungleDisk (which works with Amazon's S3 service) are vastly cheaper for online disk space, and the aforementioned Gmail or Yahoo! Mail fits the bill for web mail.

On the topic of mail, Gmail is free and actually does a decent job of spam filtering on its own, whereas Apple's .Mac mail service is limited to whatever free space you have on your iDisk account and has woefully inadequate spam filtering when put up against Gmail's service. Apple relies heavily on Mail.app's spam filtering capabilities, which can be hit or miss, and you lose all of that "built-in" functionality when you eventually come to the realization that you've outgrown the client. Sure, you can use spam filtering in your client of choice, but that's not always an option. For example, I often check my mail accounts via my cell phone and other portable devices that don't have the horsepower to filter every message. Server-side filtering is the way to go, and as it stands, I'm pretty sure that a large percentage of Apple's .Mac users stuck around for the email address/service alone.
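To make the server-side point concrete: when a filter such as SpamAssassin runs on the mail server, messages arrive already tagged, so even the dumbest client only has to read a header. A minimal sketch of what that looks like from the client side (the host and credentials are placeholders; X-Spam-Flag is SpamAssassin's convention, not anything .Mac offers):

    import imaplib

    # Connect to a hypothetical IMAP server whose operator runs a
    # server-side filter such as SpamAssassin.
    conn = imaplib.IMAP4_SSL("mail.example.com")
    conn.login("user", "password")
    conn.select("INBOX")

    # SpamAssassin tags each message with an X-Spam-Flag header before
    # any client ever fetches it, so a phone client can skip spam
    # without downloading or scanning message bodies.
    typ, data = conn.search(None, '(HEADER "X-Spam-Flag" "YES")')
    spam_ids = data[0].split()
    print(f"{len(spam_ids)} messages already flagged as spam by the server")
    conn.logout()

Any client, from Mail.app down to a phone, sees the same verdicts, which is exactly what .Mac's client-side approach can't offer.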

Rui gets down on the other parts of .Mac: syncing, web albums, iChat, and the notoriously slow iDisk. He also notes a few examples of what Apple could do to sway many users to its service: IMAP IDLE support, moblogging support, over-the-air iSync, and contact and calendar syncing.

One thing to keep in mind is that while these kinds of features, and beefing up .Mac in general, will sway some of us nerds over to .Mac, for many users it's already a handy service that integrates with many of their applications and lets them do a lot of things that they would otherwise have no idea how to accomplish. For people with limited computer skills, .Mac is worth US$99 per year and more. However, there is always something to be said for attracting "power users" and other "influencers" to a service. You want the people with the biggest mouths and the most reach to speak highly of your product, not dog it at every opportunity.

With all of that said, I'm almost certain that Apple has to be working on some sort of functionality upgrade to the service. In the face of all these new competitors, and without a significant update in features or pricing in what seems like forever, I'm betting that Apple has something exciting up its sleeve in the near future.

Jan 01, 2019

According to documents filed by Creative with the United States International Trade Commission in mid-May, Steve Jobs approached a Creative employee at Macworld 2001 to talk shop about Creative’s then-flagship product, the Nomad Jukebox. What began at this innocuous meeting may well have led to the creation of a digital audio empire that could have rivaled what Apple has put together on its own. On the other hand, it may have also led to stunning failure for both parties.

In the past few months, one of Apple’s biggest rivals in the digital audio player space, Creative, has begun competing with Apple using legal tactics. Looking back before the birth of the iPod, Creative had a significant lead on Apple, with almost a full year’s experience in developing and selling hard-disk-based players. What is almost completely unknown, however, is that Apple was willing to work with Creative to develop an Apple-licensed player as opposed to building their own from scratch.

According to Creative’s legal filings (available via PACER) with the US International Trade Commission, Apple had originally sought to license Creative’s IP and create the iPod on Creative’s platform. The filings reveal that Apple was unsure that this scheme was going to be profitable in the long run, and proposed the radical (for Apple) idea of partnering with Creative to create a digital audio player. Creative decided against joining forces, and the rest is history.

While one might be quick to assume that a merger between the two companies would have led to a marriage punctuated with hundred dollar bills falling from the mirrored ceiling of their honeymoon suite, it may also be that a mating of these two wildly different corporations would have been destined for failure. Apple’s reckless abandon coupled with Creative’s bulky hardware and stodgy corporate culture would likely have resulted in clashes of personality, among other things.

As it stands, Apple scored big when Creative turned down their proposal, freeing them up to engineer the iPod as they had envisioned it, rather than working off of one of Creative’s prototyped designs—or using Creative’s patented IP, as the lawsuits allege.

Creative and Apple’s relations were dotted by pithy remarks and under-the-belt jabs until this lawsuit business reared its head. Apple was able to take a radically different approach to marketing, growing their market while making a lot of money. As a result, the iPod has become synonymous with the portable digital audio player market. Today, Creative and Apple stand with patent lawsuits loaded, waiting for the other to flinch first—unless they can come to a settlement themselves.

Jan 01, 2019

VoIP service provider Vonage has acquired three patents from IP license aggregator Digital Packet Licensing LLC. The patents (4,782,485, 5,018,136, 5,444,707) deal with various aspects of data compression for VoIP traffic, and are already the basis for lawsuits against Sprint, Verizon, AT&T, and Nortel, among others. In the press release about this acquisition, Vonage president Michael Tribolet says that “The acquisition of these patents is part of Vonage’s strategic plan to further develop our core technology and build on our innovative, affordable and feature-rich phone service. Vonage’s strategic plan also includes a concerted effort to grow, create and acquire other significant intellectual property portfolios.”

So far, this sounds an awful lot like patent trolling, doesn’t it? Buy a portfolio of patents with existing licensing deals and a couple of infringement lawsuits already in motion, then stand back and reap the rewards. But the situation isn’t quite that simple.

In other news yesterday, Vonage was itself slapped with a fresh patent infringement lawsuit, this time by a tiny tech outfit by the name of Klausner Technologies. Fresh off a successful lawsuit against AOL, based on the same digital voicemail patent (5,572,576), Klausner is asking Vonage for US$180 million in damages and royalties. Vonage is also already the target of a lawsuit from Verizon, where the telco claims infringement of seven different patents.

Where there’s smoke, there’s fire; maybe the Digital Packet patents were bought as insurance and defense against the incoming stream of suits, rather than as aggressive tools for seeking easy revenues from lawsuit settlements or judgments. You be the judge.

I’ve been a fan and user of Vonage’s phone service for years, and in my opinion there’s nothing wrong with the service itself. The business side of Vonage is a different story. The company is hemorrhaging cash and may never be profitable. That’s not conjecture; it’s clearly stated in the company’s prospectus for its IPO filing. Vonage looks like a train wreck in progress, and these patent issues may speed or slow that progress, but won’t stop it either way. I’m afraid I’ll have to change my phone provider soon, maybe in the next two or three years, but not because I’m unhappy with my current one. It’s because I think it may go out of business.

Jan 01, 2019

Science, and the pursuit of science, is often held up as a paragon of integrity. Noble researchers seek out empirical truths regarding the world around us, reporting their results and sometimes challenging dogma. Therefore, when reports or accusations of scientific misconduct or fraud occur, they sting all the more. Everyone expects a politician to be dishonest, but the men in white lab coats are held to a higher standard.

A notable recent example concerns the work on human embryonic stem cells conducted in South Korea under Dr Woo Suk Hwang. Readers may remember the excitement that surrounded the announcement, and then the furore following news that results were faked or enhanced, and that the postdocs working in the lab were pressured into donating their own eggs to the project.

Lest you think it only happens in South Korea, the past couple of years have seen other examples of scientific misconduct, and as a result, journals are beginning to take the issue even more seriously. This brings me to an article in last week's Science. As I've covered before, postdocs make up the backbone of most research labs in the US. Postdocs depend on good publications in the fight for the few faculty appointments, and are often at the mercy of their PIs, since postdoctoral positions fall somewhere between jobs and studentships when it comes to employment law. More and more postdocs in the US are reporting that they too are subject to the same kinds of pressures experienced by those working under Dr Hwang. Worse yet, those postdocs who try to speak out about the problem find that being the lowest rung on the ladder means that if it's a choice between them and a high-flying tenured PI, there's no contest.

Stories abound about labs where studies start with preconceived endpoints, where data that 'doesn't fit' is left out, or, worst of all, where data that is knowingly false is published anyway. It is almost understandable. Funding is becoming ever harder to obtain, and there is constant pressure from university administrators to bring in more grant money and publish in prestigious journals. Almost, but not quite. If studies are approached with a fixed conclusion at the outset, how does that differ from the proponents of ID? And the more such scandals occur, the closer the public comes to ranking the men in white lab coats alongside baby-kissing politicians.

Please don't get the impression that labs such as these are the norm; data-fudging pressure cookers are the exception to the rule, but when the exceptions are high-profile labs, the impact is all the greater. In the decade that I've been working in research, I may have heard stories in the pub regarding other labs, but I can't think of a time when anyone has told me to pretend a result never happened, nor have I seen it happen to anyone else.

Jan 01, 2019

We've reported on Steam keeping track of statistics about how people are playing Half-Life 2: Episode One, but this one is a thinker. Valve has released an update to the game, and it seems as if the biggest point was to make the game easier during the elevator scene. Remember, the game already has multiple levels of difficulty, but based on their numbers, this section was too hard for many players. Enough to change the game for everyone?

I loved the elevator portion of the game—it got my blood pumping. While I died more than once, when I was finally able to beat that section I got a nice sense of accomplishment. There's nothing worse than being stuck in one area in a game, but of course there's nothing better than finally getting past it and continuing on. You can't have one without the other. While I like the fact that Valve is trying to keep their customers happy, couldn't they have added this only to the lower difficulty levels and allowed those of us who like a challenge to enjoy the game as it was originally designed?

It's also worth noting that this update is applied when you start Steam up, whether you want it or not. Outside of unhooking your internet connection whenever you start Steam, there's nothing you can do to keep it off your system. The idea of patches that are near mandatory makes me more than a little nervous, and it's a side of digital distribution we haven't really talked about. I know a lot of people who keep certain games at a very specific version number, simply because they dislike what certain patches do to the game. If we lose that ability and a bum patch or update is released, you're going to have some grumpy people playing your game. Especially if you like a higher level of difficulty.
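Short of staying offline, there's no way to refuse the update, but a version-pinning player can at least detect when one has touched their files. A minimal sketch of that kind of check in Python; the game file path here is hypothetical:

    import hashlib
    from pathlib import Path

    def file_sha256(path: Path) -> str:
        """Hash a file in chunks so large game binaries don't eat RAM."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Record this once at the version you like, then compare before each
    # launch; a mismatch means a patch was applied since you last checked.
    print(file_sha256(Path("episodic/bin/server.dll")))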

Jan 01, 2019

To an extent, there are two schools of thought about how to go about screening for a new drug. One could be called the "Intelligent Design" method: choose a target, study its structure and biochemistry, and then design a drug that specifically interferes with it. In this case, much of the work goes into understanding the system. The second, called "High Throughput Screening," takes a more evolutionary approach. All you need in this case is a test tube version of the process you want to affect. You search for drugs simply by dumping every chemical you can get your hands on into the test tube and looking for those that inhibit the process. Once an effective compound is identified, variations of it are synthesized and tested in turn.

Although less targeted, high throughput screening has some advantages. You don't need a complete understanding of the process you're trying to alter in order to test compounds, meaning it will work with complex and poorly understood disease targets. Malaria ranks pretty high up on the scale of complex and poorly understood diseases, and a reader was kind enough to point me to an article that describes a screen of compounds for the ability to inhibit the growth of drug-resistant strains of the malarial parasite. The technique also took advantage of a different aspect of high throughput screening: the majority of the drugs they tested have already been approved by the FDA, meaning they've passed rigorous safety screening and would be relatively inexpensive to bring to market for a new use.

Nearly 200 drugs were identified that reduced parasite cell division by over 50 percent. The authors focused on an antihistamine called astemizole, which can be taken orally. Testing in mice showed that it knocked down the load of the malarial parasite by about 80 percent. The drug itself is no longer on the market in most of the developed world, since alternatives with fewer side effects are available. As the patent on the compound has expired, however, it is being sold in over-the-counter form in the developing world. The authors also note that, as a successful drug, hundreds of variants on it have already been synthesized, and these may be worth testing for enhanced anti-malarial activity.
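The hit-selection step itself is simple enough to sketch. Here's a toy version in Python; the compounds and numbers below are invented for illustration and are not the study's data:

    # Toy hit selection for a high throughput screen: keep any compound
    # that cuts parasite cell division by more than 50 percent relative
    # to untreated controls. All values below are invented.
    screen_results = {   # compound -> division rate vs. control (1.0 = no effect)
        "astemizole": 0.2,
        "compound_b": 0.9,
        "compound_c": 0.45,
        "compound_d": 1.02,
    }

    hits = {drug: rate for drug, rate in screen_results.items() if rate < 0.5}
    print(sorted(hits))  # ['astemizole', 'compound_c']

In the real screen, each hit would then go on to dose-response testing and, as described above, animal models.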

The work was done at Johns Hopkins, and there's an interesting coda in the final paragraph that shows that this promising work may not be the last high throughput screen to use this panel of drugs:

Currently, the JHCCL is undergoing expansion to include every available drug ever used in the clinic via phase 2 clinical trials or approval by the FDA or its foreign counterparts. When complete, the JHCCL will be available to any researcher interested in screening for existing drugs that may be useful as economically viable new therapies for diseases of the developing world.

Hopefully, this won't be the last time we report on screens using this panel of drugs.