As the debate over Windows Genuine Advantage rages on, Microsoft is attempting to rein in speculation that the antipiracy tool could be used to put an abrupt end to the use of pirated versions of Windows. A spokesperson for the company firmly denied that the tool would be used in such a manner, saying, "No, Microsoft antipiracy technologies cannot and will not turn off your computer." Confusion remains over just what WGA is designed to do.
Dislike for WGA and what it represents has been brewing for years, dating back to the release of Windows XP. At that time, Microsoft required new copies of the Windows XP operating system to "activate" over the Internet using Windows Product Activation (WPA), a process that required a user’s consent to send identifying information about their computer and OS to the company. While that information was essentially nothing more than an authenticity code coupled with select system specifications, many users were uncomfortable with the tactic. Still, it was much like a tetanus shot: one quick prick, and it was over.
Piracy, of course, lived on, and WPA has largely been assessed as a victory only to the extent that it stopped many forms of casual piracy. With Windows Genuine Advantage, Microsoft is looking to improve on the anti-piracy tools of 2001, and WGA is best understood as the heir to WPA. Whereas the original tools only required activation once in the first 30 days of use, WGA is designed to constantly monitor a system’s licensed state. In very general terms, the idea is to make life as a so-called pirate difficult.
"The game is changing for counterfeiters. In Windows Vista, we are making it notably harder and less appealing to use counterfeit software, and we will work to make that a consistent experience with older versions of Windows as well," said a spokesman in a statement.
Still, the company has not fully disclosed their vision for WGA, leaving many questions unanswered. After talking with several trusted sources about Microsoft’s plans for Windows Vista, I believe I can shed some light on the reasons why WGA behaves as it does, and why Microsoft will indeed be using antipiracy strategies that continue to monitor one’s licensing state long after the initial setup.
The itch that WGA scratches
WGA is designed to identify a computer’s licensed state and to report that state to Microsoft. Generally speaking, Microsoft wants this information for two reasons. First, they want to fight casual piracy, and this is one way to discourage it. The company believes that tools such as WGA will make it less likely for people to share OS copies or install the same OS throughout, say, their home.
Second, they want you to be wary of pirated software, and this is one way to encourage that. Microsoft believes that commercial forms of piracy are especially egregious because they typically involve a third party selling counterfeited software—software that Microsoft ends up supporting for free. WGA is designed to kill two birds with one stone by tying OS updates to WGA monitoring. The end result is that Joe Consumer has a good reason to make sure his software is legitimate (to get updates), but there’s also a new side effect: the company believes that if Joe Consumer learns that he was sold counterfeit software, he’ll help nab the crooks, as it were. You can see this aim in Microsoft’s policy regarding known cases of OS piracy:
"Qualifying customers who fill out a counterfeit report, provide proof of purchase, and send in their counterfeit CDs may receive a genuine copy at no cost. Customers may also purchase an electronic license of Windows XP Home for $99 or Windows XP Pro for $149, or from their favorite local resellers," the spokesperson told Ars Technica.
From Microsoft’s point of view, if you have pirated/counterfeit software on your computer, you’re either a victim or a pirate. If you’re a pirate and caught, or if you’re a victim but have no proof, you can buy legitimate keys. If you’re a victim and you can prove it, you get a free replacement. The program is clearly designed to smoke out counterfeiters while collecting licensing fees.
The move to constant monitoring
This still does not address the change from a one-time authenticity check to what is essentially constant monitoring. To explain this, I offer the following hypothesis: constant monitoring is going to become very important with Windows Vista. Here’s why: the new OS will be the first from Microsoft that supports upgrades on the fly, allowing users who purchased one version of Vista to "upgrade" to other versions by simply obtaining a new license key and inserting their old installation disk. Dubbed Anytime Upgrade, the program takes advantage of Windows’ modular design. When consumers head to the store to pick up Windows Vista next year, they will actually be picking up media that has all flavors of the desktop OS on it, regardless of what the box says. Joe’s Windows Vista Home Basic disc will also have all of the features found in Vista Ultimate, and Joe can activate those features for an upgrade price to be announced later.
The end result is that the OS can be upgraded "in place" using existing media. The benefits are obvious: Microsoft hopes that users will learn about features in the Premium and Ultimate versions of Vista and want to upgrade, and the chances of them doing so are much higher if they already have the media and a simple way to obtain a new licensing key. It could even become an impulse buy. (And I must point out that this can also be used to sell future updates to Vista as well.)
The potential rewards also come with risk. What is to stop users from buying the cheapest version of Vista (or even pirating it, for that matter) and then using hacks to easily upgrade to the best version? This is where WGA’s persistent monitoring comes in. Through updates delivered to the application, known exploits will eventually be identified, or so the company hopes. Post installation hacks, whether to gain new features or change product keys, can now be identified in the field and targeted dynamically instead of waiting for the next major service pack (which could be years away). Furthermore, valid keys that are leaked can also be quickly disabled, although the company hopes to have another solution for leaked corporate keys in place soon. The fight against key leaks explains why the persistent monitoring will also be applied to OSes such as Windows XP, which cannot take advantage of Anytime Upgrade.
As we move closer and closer to a world where portable physical storage formats are replaced by high-performance networked storage, software developers are drooling for a safe way to sell software and software upgrades online, cutting out the middleman. Some are doing it already; others want in. For big-time piracy targets such as Microsoft, the rush to sell software online must first be reconciled with antipiracy strategies. Microsoft and others know that post-installation exploits can be attractive to pirates, even sophisticated exploits that involve more than just replacing a specific DLL or editing a registry key. For Anytime Upgrade and its forthcoming brethren to be a success, persistent monitoring is going to be part of the equation.
Many may have heard by now about a very vocal group of white MacBook owners who complained that the palm rest areas of their machines' white plastic were beginning to stain. Apple's original response was that the MacBooks had been subjected to "improper handling" by owners, which ultimately caused the odd discoloration on the otherwise pristine-looking casing of the computer. Truth be told, even Infinite Loop's Clint Ecker says he was convinced that his MacBook palm rest stains were merely the result of using the computer with "dirty hands."
Well, Clint, you can pick up the pr0n again, as Apple appears to have changed its stance on MacBook stains. While cosmetic issues on Mac laptops and desktops have never been covered under AppleCare, Apple seems to have decided that the discoloration on the casing of white MacBooks is, in fact, not necessarily due to improper handling by users but to a manufacturing defect.
What kind of manufacturing "defect," you ask? Well, nobody knows. However, Apple is offering to replace the top case of your stained MacBook if you are affected by the discoloration.
Those with discolored MacBooks should contact AppleCare and report the stain problem. The replacement top case should no longer have this issue, nor should the newest MacBooks: Apple seems to have changed the plastic, as the newest MacBooks have a much smoother feel in those areas, while the ones with the problem feel rough.
Of course, the mere existence of such a cosmetic issue on machines often considered to be among the sleekest and sexiest laptops around can't look good for Apple. It seems, however, that the company is hoping that publicly acknowledging and fixing the problem will save its (somewhat struggling, these days) PR image.
What would you do if your web site was “sandboxed” by Google? If you’re children’s search engine KinderStart.com, you’d spend a few weeks wondering why site traffic had fallen by more than 70 percent and why AdSense revenue was in the toilet. Then you’d sue Google.
The company filed a complaint (PDF) against Google earlier this year in which KinderStart alleged that Google’s behavior was monopolistic and violated the smaller company’s “constitutionally guaranteed freedom of speech.” The blockage of KinderStart’s site from the Google index allegedly occurred without warning or notification of any kind, and the company’s attempts to get answers from Google have fallen on deaf ears. Now KinderStart wants its day in court, and its complaint seeks class action status for all companies similarly affected.
Google is no stranger to these sorts of lawsuits. Courts have so far been unwilling to rule against the search engine because it is a private business that is allowed to make its own editorial decisions about what will and will not be included in the Google index. Google agrees, and in its motion to dismiss points out that chaos would result if courts got involved in the search engine business.
Plaintiff KinderStart contends that the judiciary should have the final say over that editorial process. It has brought this litigation in the hopes that the Court will second-guess Google’s search rankings and order Google to view KinderStart’s site more favorably. If KinderStart were right, and websites could use the courts to dictate what the results of a search on the Google search engine should be, neither Google nor any other search engine could operate as it would constantly face lawsuits from businesses seeking more favorable positioning. Fortunately, KinderStart’s position finds no support in the law.
Though the judge in the case has shown skepticism toward most of KinderStart’s claims, he did show interest in the charge that Google may be abusing a monopoly position in order to silence competitors. He may now give KinderStart time to amend its complaint with more specific information ahead of a September 29 hearing. His concern is apparently that Google has taken some sort of action against a rival search engine, a move which could be seen as an abuse of market power. KinderStart agrees.
“What Google is trying to do is take out the competition,” said Gregory Yu, KinderStart’s attorney.
This claim seems difficult to reconcile with the fact that Google provides ready access to far larger search engines like Yahoo, MSN, and Ask, just as it is difficult to make the case that the Big G is actually a monopolist (it currently serves up less than half of all Internet searches in the US).
It must be frustrating for a site like KinderStart to have its traffic dry up without explanation, but the business model has a problem if 70 percent of all traffic comes from Google searchers. Such a number suggests that the site has been unable to attract repeat visitors or to build its brand as a one-stop destination for children’s information. It also highlights the need for a diverse revenue base, one that is not built solely on AdSense.
I'm not going to say that games are making as much money as movies, because then the comments would be nothing but numbers and people arguing about the box office vs. DVD sales, so let's all agree that games are a huge business. The issue of course, is that no one really seems to know how to market them. Commercials are nearly worthless, no one in the target demographic watches them, and what can you say about a game in thirty seconds that will make people want to play it? You have games like Halo, where teasers and trailers will be drooled over and argued about for months, but not every game is a Halo. So how do you get the excitement level for games up to movie levels?
This article has a few ideas, but each of them has a drawback. Hardcore events… but what about the casual gamer? Viral marketing… but that only works with people who tend to spend a lot of time on the Internet anyway. The game version of King Kong simply used the momentum of the movie to great success.
It's clear that there aren't a lot of strategies out there that work for games, and most of them will simply appeal to a core audience and then rely on word of mouth from there. The example offered by Blitz: The League is a great one, however. When the developers didn't have an NFL license, they played that aspect of the game up, saying they were showing you, and letting you do, things that the NFL would never have agreed to. That led to a lot of buzz as people wanted to see what the NFL was so scared of. The NFL wasn't scared of anything; its license had simply been bought out by EA. The trick worked, though, and the game sold well.
You have to admit, that's pretty brilliant. Maybe it's just a matter of ad execs thinking on their feet and coming up with something just as good for every game. Of course, that takes time and creativity, two things both the ad and gaming businesses often lack.
There's a time in every gamer's life when you have to make the decision: do you enjoy playing videogames, or do you collect? You can do both, of course, but there's certain behavior in each group that usually makes people fall into one specific camp. People who just like playing games have no trouble selling their titles when they finish them. Collectors, at the opposite extreme, will often buy rare games they find even if they won't get to play them for a long time, if ever. In fact, the game may just sit in the shrinkwrap, never loved, just so someone can say they have a mint copy of Suikoden 2 on their shelf. These are the sort of people who will look at you sadly if they see Greatest Hits versions of games on your shelf.
Having my game collection in storage has broken the collecting bug; the only games on my shelves are the ones I'm playing or reviewing. Once I get those racks and racks of games back in the house? It's over for me. A game goes into the collection because it completes a series or simply makes one system's row look bigger, and it's never leaving. From my copy of X-men Legends signed by Stan Lee to my Dreamcast shooter collection, there's a lot of stuff I could never part with.
Gamespot has a feature that shines a light on the seedier side of game collecting, with the editors talking about the five games they have on their shelves that they think make them look cool. Sometimes it's an underappreciated gem, sometimes an import, sometimes it's something as mundane as Guitar Hero that's there to prove they can still rock. Admit it: if you collect, you have one or two things on your shelf that you always hope someone notices when they come over. The best reward for collecting is someone pulling a game out of your stacks and exclaiming, "Oh man, you have a copy of this?!?" There simply isn't a better feeling in the world.
If nothing else, this feature should remind you just how cool Otogi is.
What happens when rich, poor, and middle-class countries get together to agree on future IP regulation? If your answer was anything but “gridlock,” you’re an incurable optimist.
Last week’s WIPO (World Intellectual Property Organization) meeting was supposed to mark an important step forward for the WIPO Development Agenda, but instead left it standing in place. The Development Agenda began in 2004 when Brazil and Argentina introduced a proposal for future WIPO regulation that would make WIPO into something more than the international IP police. Future WIPO decisions would be guided by this Development Agenda, which seeks a balance between copyright holders and the public, especially in developing countries where access to IP (think patented drugs, for instance) is a huge concern. The Agenda would also make WIPO into more of a development body by directing the organization to provide technical assistance for developing countries.
WIPO agreed to adopt a Development Agenda and held three meetings on the topic last year. What emerged was a set of 111 proposals (PDF) grouped into six categories. Last week’s meeting of the Provisional Committee on Proposals for a Development Agenda (PCDA) was supposed to come up with a recommended list of proposals to present to the WIPO General Assembly in September. Coming up with proposals is simpler than agreeing on them, though, and the meeting ended up with little consensus on the most important issues.
The EFF attended the meetings and has posted both notes and transcripts on their web site (day one, day two, and day three). Progress was impossible, as the group could not even decide how to evaluate the proposals. On the last day, Brazil and Argentina both announced their withdrawal from the meeting due to concerns that the method of selecting proposals for recommendation had been unfair and that most of their core concerns had not been included. In the end, the PCDA punted, sending the matter back to the WIPO General Assembly.
Depending on what proposals are ultimately passed, the Development Agenda could have a significant impact on issues such as health care and the public domain in countries across the globe. It could also ensure that countries have some leeway in passing their own IP laws, rather than following WIPO decisions in lockstep. Finally, the Agenda would put much more emphasis on technology transfer and technical assistance designed to benefit up-and-coming countries who want to compete in the knowledge economy.
Whether the full General Assembly can reach more of a consensus than the PCDA did remains to be seen. The September meeting will no doubt be contentious, as the EU and the US wield so much power and oppose many of the proposed changes.
The Seattle Post-Intelligencer just had a chat with Blake Ross of the Firefox team, talking over points such as the project’s success and how Microsoft motivated the whole project. Blake says that Microsoft’s lack of browser innovation in the absence of decent competition makes him “furious,” and that the Google tie-in is based more on the search engine’s quality than on any marketing agreement.
But while it’s a good read, the interview is short on future direction. Let’s take a look at the published plans to see what’s coming up in the next couple of major Firefox releases.
First, there’s Bon Echo, aka Firefox 2.0. We have reported on that version’s progress a few times already, so let me just point out a few corrections and changes to the earlier plans. It has been said that Windows ME support would be dropped, but apparently not in Bon Echo. The “priority 1” platform support list includes Windows Vista/XP/2000/ME (but not 98), as well as Mac OS X 10.2 and up, and Red Hat Linux (no specific version). Other Linux versions are P2, meaning they will probably be supported but there are no guarantees.
Support for themed or branded builds is high on the wish list, indicating a desire to branch out to more distribution partners. Some functions currently handled by elective extensions, such as session resume after restarts and crashes, or on-demand spell check functions, are slated for inclusion. Syndication feed handling needs more work, as does the fit-and-finish of the overall application interface. The stated goal is to make Firefox look and feel like a native application across Windows, OS X, and Gnome environments.
Otherwise, not much seems to be changing. There are no major code overhauls here, and the focus is on bugfixing rather than performance improvement. In addition, the new database-backed version of the bookmark and history system has been shelved for now. That system and overall performance improvements are scheduled for Firefox 3 at this point, along with greater standards compliance, better security, and i18n internationalization support.
In general, it looks like Firefox 2 is a spit-and-polish job, designed to look and feel as professional as possible, and leaving the really big changes for the next major release. Along with the marketing-friendly branding feature and the “sizable chunk of revenue” the Mozilla project has amassed from search engine deals and the like, I think I can smell a nice, big marketing push alongside the final release of Bon Echo. Will we all be sick of Firefox TV commercials soon?
Music industry lawsuits: they’re not just for the West anymore. The International Federation of the Phonographic Industry (IFPI), the international body charged with protecting music labels, has announced its plan to sue Yahoo China within the next couple of weeks over the search engine’s alleged links to sites hosting pirated music.
Similar action was also taken last year against the leading Chinese search engine, Baidu.com. The IFPI’s threats are based on the claim that it is illegal to link to illicit material, a claim bolstered by a new Chinese law. Bloomberg has the details:
The federation is also considering using a new Chinese law that came into effect July 1 that fines distributors of illegally copied music, movies and other material over the Internet as much as 100,000 yuan ($12,500). As of today, Chinese search engines operated by Yahoo China and Baidu.com provide links to other Web sites hosting illegally copied songs.
The law says a Web site is jointly liable with the host of the pirated files for infringement “if it knows or should know that the work, performance or sound or video recording linked to was infringing.”
Though the explicit threat of a lawsuit is something new, IFPI chief John Kennedy used a May speech in Shanghai to indicate that his organization was prepared to take on more Chinese search engines over the issue of piracy.
It is clear that the ISPs are far from adequately supporting us today. I have been very disappointed in recent months to see some well-known brand names among the internet companies blatantly infringing our members’ rights. Baidu has already been found guilty of copyright infringement in the Chinese courts; China-Yahoo is now in a similar position, choosing to turn a blind eye to the infringements taking place on its service instead of setting the example of responsible practice which we would expect from them. We are watching China-Yahoo closely and will have no hesitation in acting to protect our members’ rights if we should have to.
Though such a lawsuit would hassle Yahoo China (not actually run by Yahoo, but by Chinese operator Alibaba.com.cn), the more interesting aspect to this story is what the recent legal cases say about China. The country has not always been a role model for the rule of law, nor has it always appeared interested in fighting piracy. The situation got so bad that the US threatened to go to the WTO and seek sanctions, but China has recently been saying and doing all the right things. The IFPI suits indicate the entertainment industry’s belief that legal threats now mean something in China, and that the Chinese market is poised for enough growth to make the legal effort worthwhile.
Such moves, coupled with statements from Beijing about the need for China to get serious about enforcement, suggest that the Chinese IP free-for-all may be slowly winding down. With piracy rates for music and software still hovering at 90 percent, substantial change will come slowly, but there’s little doubt that it’s on the way.
July 4th approaches, traditionally the time the US celebrates throwing off the shackles of English Imperialism with the consumption of large quantities of alcohol and the combustion of large quantities of explosives. However, not everyone is hoping for a big bang. NASA had been planning to return the space shuttle to the skies, and explosions are definitely not on the agenda.
The space shuttle Discovery has sat on the launchpad in Florida for several days now, waiting for the all-clear signal, but first weather, and now safety inspections have been raising red flags. The problem remains the insulation on the large fuel tank that contains the liquid oxygen and hydrogen that fuels the shuttle's main engines. The tank is covered with insulation foam, and bits of this foam can fall off, damaging the orbiter on the way down. Such damage was responsible for the loss of the shuttle Columbia three years ago.
The launch was planned for tomorrow, July 4th, but NASA is meeting today to determine whether or not it is safe to proceed. There are also concerns that the repeated draining and filling of the tank is causing flexing that may be behind the cracks found in the foam.
NASA isn't just worried about foam debris. Cape Canaveral is also home to lots of vultures, which feast on the roadkill produced by all the interested visitors. When a 1 kg chunk of foam can result in the loss of an orbiter, a bird strike involving a vulture weighing three times that is a serious problem. As a result, NASA has set a trap several miles away to keep the vultures clear of the launch site. The birds will be released once Discovery is on its way.
When I think of lasers, I think of precision optical devices that take hours of careful alignment to optimize. The reason is that stimulated emission is usually much weaker than spontaneous emission; without the feedback created by carefully placed, highly reflective mirrors, there would be no laser. This description, while accurate, ignores the overall physics behind the laser, which is best seen by looking at random lasers.
Random lasers have been around for a few years now, and a recent article in the Journal of Applied Physics has given me an excuse to write about them. Instead of creating your laser from a nice optically transparent medium surrounded by mirrors, you simply blast a powder with energy, causing the particles to glow. The trick is that each particle in the powder will reflect light in all sorts of directions, and between certain pairs of particles there will exist a path for the light to travel back and forth as though the particles were mirrors. Between the two "mirrors," the light travels along a complex, narrow path that is much longer than the straight-line distance between the two.

Thus, in contrast to a conventional laser, where a large volume of material must be excited, only a few atoms take part in the lasing process, so it is relatively easy to get those atoms into the correct state to lase. Since all the atoms are ready to lase, the mirrors at the end points don't need to be all that good. The best thing is that in any powder this occurs for huge numbers of particles, so you end up with more than one laser.

This paper uses zinc oxide embedded in a polymer matrix, which holds the powder in a fixed formation, making it much easier to work with. Zinc oxide, when sufficiently excited, lases in the ultraviolet region, where normal semiconductor lasers will not operate. Zinc oxide also has the potential, through some trickery using nonlinear optics, to produce a huge range of colors from the ultraviolet out to the mid-infrared: a spectroscopist's wet dream.
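The key intuition here, that multiple scattering folds a very long optical path into a tiny volume, can be illustrated with a toy random-walk model. This is a deliberately simplified sketch of my own, not a simulation from the paper; the isotropic-scattering assumption and all parameter values are made up for illustration:

```python
import math
import random

def scattering_walk(steps, mean_free_path=1.0, seed=0):
    """Toy 2D random walk standing in for a photon bouncing among
    powder particles. Returns (total path length traveled,
    straight-line distance between start and end points)."""
    rng = random.Random(seed)
    x = y = 0.0
    total = 0.0
    for _ in range(steps):
        theta = rng.uniform(0.0, 2.0 * math.pi)        # isotropic scattering
        step = rng.expovariate(1.0 / mean_free_path)   # exponential free path
        x += step * math.cos(theta)
        y += step * math.sin(theta)
        total += step
    return total, math.hypot(x, y)

path, displacement = scattering_walk(1000)
# after many scattering events, the folded path is far longer
# than the net separation of the endpoints
print(path, displacement)
```

With a thousand scattering events the accumulated path length comes out roughly an order of magnitude longer than the endpoint separation, which is the effect the text describes: the light's round trip between two "mirror" particles is much longer than the distance between them.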
These laser sources are the very antithesis of normal laser development. No care in construction is required, and the system is incredibly robust. If you shake the thing up and down a bit (something that would cause an ordinary laser to hemorrhage), all you get is a group of new lasers. Of course, since the laser is random, no one really knows in which direction it will emit. And since only a few particles make up each laser, the power doesn't really scale to very high levels. However, this isn't so bad, since many laser applications don't require high power but would benefit from reduced cost. The biggest problem is getting these things started: at present, a huge pulsed laser, the likes of which would not fit in your CD player, is required, which is kind of a problem for those thinking about monolithic applications.