Hate on the interface all you want, but the fact is that Lotus Notes remains the second-most-popular enterprise groupware application around. One reason for its popularity is that it provides a full-featured client for both Windows and Mac OS X, making it attractive for companies with heterogeneous environments. IBM is about to add Linux to the list: a full version of Lotus Notes for Linux will be released shortly.
For years, Linux users who wanted or needed to use Lotus Notes have been forced to run it under WINE or in a virtual machine. That will soon change, as Lotus Notes on Linux will be released for Red Hat Enterprise Linux v4 update 3 with full support from IBM. By year's end, Big Blue will also support SUSE Linux Enterprise Desktop 10 as well as the upcoming Novell desktop Linux.
Lotus Notes for Linux is not a fully native port, however. It is instead the first version of Lotus Notes to run entirely on IBM’s new Eclipse Rich Client Platform, a piece of middleware derived from the open-source Eclipse Foundation project. According to IBM senior offering manager Arthur Fontaine, using Eclipse allows IBM to write once and run anywhere: "because it handles the integration with the operating system, applications written to Eclipse have a native look and feel but are cross-operating system by default."
The next major release of Notes for both Windows and Mac OS X—coming in 2007—will also use Eclipse, meaning that Notes "will run with complete equivalence" across all three platforms.
I used Lotus Notes for nearly 10 years and, like many other users, I had something of a love-hate relationship with it. I would even venture to say that the needle pointed toward the "hate" side of the gauge most of the time. Much of that was due to the interface, which has its own exhibit in the Interface Hall of Shame, but I also found the application bloated and resource-hungry.
That said, it does the whole groupware thing quite well, and adding complete Linux support may strengthen its hand in the groupware space. No one believes that Lotus Notes on Linux will result in mass migrations from Windows to Linux on the corporate desktop. However, some large companies are reevaluating their entire desktop strategy at the moment, deciding whether to migrate to Windows Vista down the road or cast their lot with Linux or even Mac OS X. For those companies reliant on Lotus Notes, being able to migrate their desktops while keeping users on the same groupware may tip the scales in one direction or another.
Hardware manufacturer Ricoh has developed a new optical drive capable of reading and writing several formats, including HD DVD, Blu-ray, CD, and DVD. The device will be demonstrated at the International Optoelectronics Exhibition later this month, and a read-only version could potentially be available to OEMs by the end of the year. A write-enabled version, which will require a more powerful laser than conventional optical disk recording devices, will hit the market at a later date.
In various optical disk formats, the data layer is positioned at different distances from the surface of the disk, a factor that contributes to the difficulty of supporting multiple formats in a single drive. Ricoh’s new optical drives use a special diffraction grating that can adjust the laser so that it can target the correct depth for each format. According to a Ricoh representative, this is the first drive that can handle all four major formats:
“This diffraction device is the first one that is ready for four formats, including BD and HD-DVD,” according to a spokesperson from Ricoh. “It will make it possible to build players and recorders ready for all formats, which will benefit consumers.”
Although pricing information is not yet available, it is likely that these new combo drives will be fairly expensive. With Blu-ray disk players selling for close to US$1,000, it will take some time for next-generation optical recording devices to become a mainstream option for desktop computers. Will we see PCs with a superdrive that supports four formats in the next couple of years? It is certainly possible, but with the rapidly decreasing cost of magnetic storage and the ongoing roll-out of fiber-optic bandwidth, the next generation optical formats could become an anachronism before they become ubiquitous.
Towards the end of June, I covered a few discoveries regarding HIV's ability to cause havoc with the human immune system. The studies provided some new information regarding the virus' interactions with the immune system, information that may prove useful in the future. But at the moment the data mostly serves to deepen the confusion about why this virus is so devastating to humans.
So the thought that more study could lead to greater confusion was on my mind recently. It came up again as I listened to the press conference on a report from the National Academy of Sciences, which evaluated the current state of knowledge regarding reconstructions of past climates. If you listen to the audio of the press conference, there's a long question-and-answer session towards the end. One question, however, leapt out at me.
The question focused on the Medieval Warm Period, a time of relatively high temperatures in Northern Europe that coincided with the westward colonization of the Vikings. More detailed studies of samples from around the world, however, have recently suggested that the situation is much more complex. Although some regions of the planet warmed during that time, others did not. Those areas that warmed also did so on different schedules. Combined with large uncertainties in the data, the NAS panel was forced to conclude that the evidence for the Medieval Warm Period was ambiguous at best, and that any such event did not represent a period of global climate change.
This really bothered the questioner, who made a series of statements and questions that distilled down to the following: how could we have spent all that money on the studies and wind up knowing less than we appeared to when they started? The scientist who fielded that question seemed to have a hard time of it. After all, how do you explain an inevitable aspect of science within the confines of a press conference?
There are two potential ways of portraying this to the public that have come to my mind since. The first is to simply return to a statement that's become a bit of a cliche: the best scientific results raise as many questions as they answer. In this way, knowing that the Medieval Warm Period wasn't global is a good scientific result, as it raises the question of whether those regions that did warm shared something in common. A second way would be to point out that the work was a success in that it allowed us to recognize the complexities of the situation. In the US, a war on cancer was declared in the early seventies, and billions of research dollars have been spent during the intervening years. But decades went by during which most of the progress involved the recognition that cancer was not a single disease, but rather a complex array of diseases. Have those billions been poorly spent? As we are now seeing the first targeted therapies based on this understanding, it's easy to say no. Whether that could have been said so easily a decade ago, however, is less clear.
Stepping up its campaign against illicit file-swappers, the British Phonographic Industry (BPI) has moved from targeting individual users to putting pressure on their ISPs. The BPI has just announced that 59 accounts suspected of large-scale piracy have been reported to two ISPs, which are expected to deal with the issue.
Seventeen requests went to Tiscali, while another 42 were sent to Cable & Wireless. The ISPs offer no guarantee that anything will be done, but the BPI wants to move faster against suspected file-swappers than is possible in the court system. They also want to paint the ISPs as complicit with the swapping through their own inaction. As they put it, "While the BPI retains the right to pursue cases against individual uploaders, the move against ISPs who have so far failed to take effective steps to stop illegal filesharing marks a significant development in the BPI campaign—allowing the record industry to deal with a greater volume of cases more quickly and efficiently."
Looking at their court records, one can see why they would prefer to change strategy. The BPI has filed only 139 charges against file uploaders—merely a drop in the bucket. Only four of these cases have actually proceeded to trial, while another 111 have been settled out of court for a few thousand pounds each. Given the amount of resources that must be devoted to finding and prosecuting such individuals, it makes more sense for the organization to forget about trying to extract money from defendants and simply concentrate on shutting them down.
The BPI now gathers IP addresses and supporting documentation on egregious offenders and simply turns the information over to the ISPs for action. This means that the BPI does not know the identities of the people it targets, and it is not trying to get a subpoena to find out. In a press release, the BPI indicates that all it cares about is getting these users to sign "undertakings agreeing to stop unauthorized filesharing."
Whether the new, more pragmatic approach bears fruit largely depends on how the ISPs handle the matter. Neither Tiscali nor Cable & Wireless will be excited about devoting resources of their own to sifting through complaints from the BPI, but neither do the ISPs want to be portrayed as looking the other way while users hoist the Jolly Roger on their networks.
Parallels Desktop 1.0 for Mac OS X
Developer: Parallels (product page)
System requirements: Any Mac with an Intel CPU, Mac OS X 10.4.6, 512MB of RAM, 30MB free drive space
Price: US$79.99 (US$49.99 through July 15)
Move over emulation, virtualization is in and it's hotter than two Jessica Albas wrestling the devil himself in a pit of molten steel. It's no contest, virtualization has it all: multiple operating systems running on the same machine at nearly the full speed of the host's processor, with each system seamlessly networking with the next. Add to that the fact that it's cheaper than getting a new machine and you have the guaranteed latest craze. Not even the Hula Hoop can stop this one.
Okay, virtualization isn't totally new; it's just new to Macs, and Parallels Desktop is the first out the door with a 1.0 product for Mactels. For those just getting to the party, here's a quick breakdown of virtualization. The idea is that a program acts as a virtual machine (VM) whose job is to play the part of a PC (one of the more boring drama classes), tricking the guest OS into thinking it's inside a real x86 machine with a physical hard drive, keyboard, Ethernet card, and so on, when in reality it's merely grabbing unused CPU cycles and RAM inside another OS to do its thing.
The benefits over a real PC are pretty clear: it's running on the Mac you know and love, but you're not sacrificing access to the occasional Windows-only app that you might need. Maybe you have a copy of Office XP for Windows and don't want to shell out for the Mac version. Sure, you could load up Apple's Boot Camp, but using a program like Parallels (or its competitors VMware, WINE, and MS' Virtual PC) means you don't have to reboot just to use that accounting program at work.
It is a great prospect, and now even Apple is recommending Parallels on its Get A Mac site:
That's the corporate equivalent of Jesus endorsing your sandals.
When that page went up, the price of Parallels not-so-coincidentally jumped from $50 to $80, so let this be a lesson to us all: never say "wow, that's so cheap" on a public forum again. Even so, it's still cheaper than the $129 charged for the standalone Virtual PC package, and if it works as advertised, there's really no comparison. Parallels promises to be a big upgrade from the pokey and painful Virtual PC emulation. So let's see if it's the cheap and fast hydra PC we've all been waiting for.
Minimum requirements:

- Any Intel Mac (a machine with VT-x support is not required)
- A minimum of 512MB of RAM, 1GB recommended
- 30MB of available HD space for Parallels, plus enough room for the VM's OS
- OS X 10.4.6

Test hardware:

- MacBook Pro 2.0
- 2GB RAM
- OS X 10.4.6 / 10.4.7 (both tested)
Unless you have a lot of gaming friends, it can sometimes seem like you're the only person who spends a lot of time with your portable system. Portable gaming systems are like Backstreet Boys albums. Bear with me here; this is going to make sense. See, we keep reading about millions of people buying these things, but you almost never see one out in the wild. Luckily, at this weekend's Indy Ars Meet, a number of attendees whipped out their DS Lites for some drunken Tetris.
I haven't spent a lot of time playing the DS Lite in multiplayer, and it's amazing how well the single-cartridge multiplayer works. We played Tetris, New Super Mario Bros. and Point Blank. The gaming was great: Jacqui has some pretty decent Tetris skills, and even though we only had one copy of these games, everyone was able to play. Nintendo's system for getting the data to other systems and then playing is simple, quick, and painless. We had a great time and talked a ton of trash. Tetris remains a great time and the multiplayer New Super Mario Bros. can turn nasty quickly.
As Nintendo and Sony push ahead with their portable systems, it's worth noting that as more people get interested in portable gaming, the multiplayer aspect of these systems is going to become an important selling point. We've certainly come a long way since we hooked up our original Game Boys using cables in the school cafeteria (what, you didn't do that?), and Nintendo really seems to have multiplayer nailed on the DS Lite. It was great to see how many games needed only one cart to let other people get in on the fun. This is a trend I want to see a lot more of.
It’s been so long since I’ve seen something move out of the infamous “three to five years away” category and into actual volume production that I can’t actually remember the last time it happened. But Freescale is making it happen now with magnetoresistive RAM (MRAM).
Freescale just announced that they’ve gone into volume production of a 4Mbit (256K x 16-bit) MRAM chip. The newly announced MR2A16A MRAM part, which has been in development for a decade and which sampled back in 2004, already has attracted a number of buyers and should start showing up in products before too long.
Back before Freescale was spun off from Motorola, we covered a couple of Motorola demonstrations of working MRAM chips. At the time, IBM, Infineon, Motorola, and the legion of twenty or so other companies that have been working on this technology were projecting that the first MRAM products would hit the market in 2004. Freescale has beaten the rest of the pack, albeit two years behind schedule, and I expect to see a steady march of MRAM product announcements in the next year as the rest of the memory industry begins bringing its own products to market.
The big picture for MRAM
I went fishing about for datasheets on NAND flash, and for devices organized as 16-bit words I found access times (read cycle) in the 50ns to 90ns range. This makes the new 35ns MRAM part quite a bit faster than current NAND flash. However, it's nowhere near the 3ns to 5ns access time of DDR2 DRAM; 35ns is more on the order of old EDO DRAM.
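Putting those round numbers side by side makes the gap concrete. This is a quick back-of-the-envelope comparison using the figures above, nothing more:

```python
# Access times cited above, in nanoseconds (rough datasheet figures):
# (best case, worst case) for each memory type.
access_ns = {
    "NAND flash": (50, 90),   # 16-bit-wide parts, read cycle
    "MRAM":       (35, 35),   # Freescale MR2A16A
    "DDR2 DRAM":  (3, 5),
}

mram = access_ns["MRAM"][0]
for name, (fast, slow) in access_ns.items():
    # Express each access time as a multiple of MRAM's 35ns.
    print(f"{name:10s} {fast}-{slow} ns "
          f"({fast / mram:.1f}x-{slow / mram:.1f}x MRAM)")
```

So the new part reads roughly 1.4 to 2.6 times faster than the NAND flash devices, while DDR2 is still about an order of magnitude faster than MRAM.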
So while MRAM’s current access time and capacity won’t make it an immediate replacement for either DRAM or NAND Flash, both of these numbers will scale to make it a major long-term player in the storage market. For large categories of devices that are more sensitive to power than raw performance, MRAM will replace DRAM and Flash altogether. For the longer-term performance picture, MRAM will replace DRAM in all but some niche applications. IBM’s research shows that they can get access times down into the 3ns range, and a paper from earlier this year shows that they expect to see MRAM access times between 5ns and 20ns.
As for hard disk technology, I think it’s pretty clear that MRAM isn’t going to become the default medium for mass storage any time soon. I imagine that many types of portable devices, from laptops to media players, will eventually use a combination of MRAM and hard disk technology. A single large MRAM pool could combine the functions of both main memory and the kinds of backing store/caching technology that’s starting to make its way to market in the form of Intel’s Robson. This fast memory pool would be coupled with hard disk-based mass storage.
MRAM’s biggest medium-term impact will be on the Flash market. MRAM is superior to Flash memory in pretty much every way—it’s faster, and unlike Flash, it can endure an effectively unlimited number of write cycles. As the rest of the industry starts to move into volume production of MRAM, the new technology is going to start moving into niches currently occupied by Flash and begin squeezing it out in some places.
Mike Shaw, ABC’s President of Advertising Sales, said this week that he would love to have the opportunity to shut down the “fast forward” button on users’ DVRs. Though he did not claim that commercial-skipping TiVo owners were thieves, Shaw is clearly unhappy with how easy it is to skip his network’s ads.
Shutting down the fast-forward feature on DVRs isn’t an easy proposition, though; ABC would require support from DVR makers, who could build their devices to recognize the presence of a broadcast flag. The flag would tell the DVR whether fast forwarding should be allowed during any particular show. But what DVR maker would want to do this? Shaw told MediaDailyNews that he “would love it if the MSOs [cable companies], during the deployment of the new DVRs they’re putting out there, would disable the fast-forward [button].” It doesn’t require a great leap of imagination to suspect that ABC is in such talks with cable companies right now.
From the network’s perspective, this is probably the best way to neutralize ad-skipping, which has taken its toll on network ad revenues over the last few years. Trying to talk a company like TiVo into disabling features that consumers want is probably a hard sell, but cable companies might be more receptive to the idea. Most cable operators also make money from local ad revenue, and would stand to gain revenue by making it harder for their customers to skip ads.
What about a customer backlash? In Shaw’s view, this is unlikely, as he believes that most people enjoy their DVRs simply because they can time-shift video content like a high-tech VCR; commercial skipping is just gravy. Take away the gravy, and customers should still be happy to eat their mashed potatoes.
“I’m not so sure that the whole issue really is one of commercial avoidance,” Shaw said. “It really is a matter of convenience—so you don’t miss your favorite show. And quite frankly, we’re just training a new generation of viewers to skip commercials because they can. I’m not sure that the driving reason to get a DVR in the first place is just to skip commercials. I don’t fundamentally believe that. People can understand in order to have convenience and on-demand (options), that you can’t skip commercials.”
The majority of DVR users do skip commercials, though, making this a potentially risky strategy for the cable companies. So long as competitors like TiVo and HTPCs exist, unhappy customers can always go elsewhere. Should TiVo, especially, ever fold, most consumers would no doubt settle for whatever is offered by their local cable provider.
Even as ABC pushes to halt fast forwarding, customer anger at TV advertising is growing. Though Shaw notes that “we’ve had the exact same commercial load for three years in a row,” he’s been hearing more complaints from customers about these intrusions into stories they are trying to watch. While no doubt irritating to consumers, who would skip commercials if they could, there’s no denying the fact that network TV shows are only free because of that advertising. Finding the right balance is proving to be tricky.
In Europe, by contrast, some television services are actually encouraging more use of DVRs. While ABC seeks ways to restrict DVR functionality, satellite provider BSkyB wants to offer more of it to its European customers. The company just announced a plan to allow remote recording of TV shows by using one’s mobile phone. Users can set their home DVRs by texting to a special number or by accessing a program guide right from their handset.
Nearly everyone has been affected by Google and its sheer ubiquity—to the point where “to google” has even become an officially recognized verb. Google gained this dominance by providing the best web searching service, which required the ability to quickly “crawl” the web, finding and indexing all the content it could get its hands on. Anyone who has set up a web server and peered at the access logs knows that the Google spiders come quickly and often.
One thing most people aren’t aware of, however, is that Google indexes more than just text and images. The search engine is also capable of indexing and searching binary files, a feature that the security firm Websense has been taking advantage of to uncover malicious and hacked web sites all over the world.
The company utilized a little-known feature of Google to search for binary strings representing Windows-based worms such as W32.Bagel and W32.Mytob. “They [Google] actually look inside the internals of an executable and index that information,” said Dan Hubbard, senior director of security at Websense.
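Websense hasn't published its tool, but the underlying idea, matching known byte sequences inside a binary, is simple to sketch. Here is a minimal illustration in Python; the signature bytes are invented placeholders for this example, not real worm fingerprints:

```python
# Minimal sketch of byte-signature scanning: the same basic idea
# Websense applied on top of Google's binary index. The signature
# below is a made-up placeholder, not an actual worm signature.
SIGNATURES = {
    "example-worm": bytes.fromhex("deadbeefcafebabe"),
}

def scan(data: bytes) -> list[str]:
    """Return the names of all known signatures found in the data."""
    return [name for name, sig in SIGNATURES.items() if sig in data]
```

Websense's real advantage, of course, was that Google had already done the crawling and indexing; a scan like this only works once you have the bytes in hand.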
Websense plans to share the code they have developed using the Google APIs with other security researchers, but does not plan to release it to the public. Hubbard fears that would-be virus authors could use the tool to jump start their activities. “Instead of buying them on the black market [an attacker] could search for them and download them on his own,” he said.
Hubbard isn’t the only one concerned about the possibilities of searching for binary patterns on the web. Claudiu Spulber, of the Homemade Computer Tutorials blog, pointed out that hackers could embed common search terms into the binary, and then hope that users looking for a particular page would find a link to the program, click on it, and run the executable. The blog post includes an example of this, an illegitimate version of the shareware “Backup4all” program, but interestingly the malicious version no longer shows up on the results page. According to a spokesperson for Google, the company continues to keep an eye out for this practice.
Should Google continue to index binary files, despite the potential drawbacks? The company’s position is that the more things on the Internet that are searched, the better things are for everyone, and that people shouldn’t worry too much about any possible misuse. Still, the more powerful a tool becomes, the more the potential for abuse increases. This applies not only to Google, but to the Internet in general. As always, skeptical computing is the best defense.
Have you tried installing Windows Vista from DVD yet? If so, have you been successful in getting it to install? I ask because some Vista beta testers haven't had very good luck installing the recent CTPs. Apparently, some users have been setting their burner speed to "Max" and letting 'er rip. The final outcome? A botched install.
For most of us, burning disks at slower speeds is a no-brainer, at least if we want to ensure that all the data has been copied properly. Then again, with blanks being as cheap as they are, I'm always one to give Alcohol's "Write Speed: Maximum" setting a click and hope for the best. However, Microsoft is warning speed demons, myself included, to lay off the gas. In the release notes of the latest Vista build, 5456.5, the company makes the following suggestion:
When burning your DVDs please do so at 1x or 2x and CRC them when done using the CRC utility posted on the Connect site. The customer experience improvement telemetry that we’ve been getting back on Beta 2 shows that not quite three quarters of setup failures are the result of a failure to read from the media. Testing shows that burning at slower speeds greatly increases the chances of a good burn.
Burning DVDs at 1x is like watching paint dry, and 2x isn't much better, but if around three quarters of failed installs occur because of a bad burn, then the company should not only include the passage in the release notes, it should also put the words in big bold letters on the download page. As for myself, I haven't run into any of these problems because I have been using VMware for all of my Vista installs. It has worked perfectly for every build, and I highly recommend it to anyone who can spare the resources. For those who have no choice but to install from DVD, you might save some time in the long run burning at 1x (it hurts me to say it) rather than getting 95 percent of the way through setup and having it hang.
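Microsoft's CRC utility from the Connect site isn't described in any detail, but the same sanity check can be approximated with a standard CRC-32. A rough sketch, assuming you have an expected checksum published alongside the ISO to compare against (the filename and checksum in the usage comment are hypothetical):

```python
import zlib

def crc32_of(path, chunk_size=1 << 20):
    """Compute a file's CRC-32 in 1MB chunks, so a multi-gigabyte
    ISO or burned disc image never has to fit in memory at once."""
    crc = 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            crc = zlib.crc32(chunk, crc)
    return crc & 0xFFFFFFFF

# Usage (hypothetical filename and expected checksum):
# assert crc32_of("vista_build_5456.iso") == 0x1234ABCD
```

Checking the burned disc against the checksum of the source ISO catches exactly the read failures Microsoft describes, before you've sunk an hour into a doomed install.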