This is good news for strategy fans who just love their consoles. I got to play Battle for Middle-Earth 2 for the 360 at E3, and was pleasantly surprised at how well it stacked up to the PC version. There's no reason the 360 couldn't handle the visuals and the number of units—the 360 is a powerful system with a good amount of RAM—but the real issue is that RTS games on consoles have never managed to nail a control scheme that doesn't feel utterly clumsy next to a mouse and keyboard.
The reviews are in, and they're positive across the board. While a lot of people are saying that the controls are a bit complex, I'm not surprised that a game like this has something of a learning curve. I'm glad to see the game seems to have lived up to its promise and delivered what may be one of the first really good RTS experiences on a console. Now, before you bring up StarCraft 64 or Goblin Commander, let me remind you of just how utterly terrible those games were. There's nothing like playing a game of StarCraft against a friend and having him able to simply glance over to see what you were planning.
While I'm usually all for innovation, I'm really hoping a ton of companies rip off EA's ideas for the interface. More RTS console games would be nice, and while I don't think you'll ever get better than a mouse and a keyboard for this sort of game, it's nice knowing that someone has succeeded in at least making something that's passable. I'm hoping to sit down and spend some time with the game myself, and when that happens I'll definitely post some thoughts.
eBay has announced sweeping changes at the top of several of its major subsidiaries. The catalyst for the reform appears to be PayPal president Jeff Jordan’s desire to “spend more time with his family,” which has him leaving the company this fall. Jordan might go back into business, perhaps rejoining the entertainment industry, where he spent many years as CFO of The Disney Store and Hollywood Entertainment and as president of Hollywood division Reel.com, or expanding his role at Expedia.com, where he currently sits on the board of subsidiary Hotwire.com. If he does, we’ll know that “family time” is just another euphemism for “you’re fired.”
Taking Jordan’s place, Rajiv Dutta moves over from his current position as president of Skype—another eBay subsidiary. He previously spent five years as CFO of eBay, and should be well qualified to handle the payment service’s operations. He’s facing fresh challenges from Google’s new Checkout service, though it’s quite possible that the threat is largely imaginary; Google Checkout is not a direct competitor in PayPal’s core market of person-to-person payments, and Google even wants the Checkout service to accept PayPal alongside Visa and American Express. In return, eBay has explicitly blocked the use of Checkout for auction payments. And so it goes.
To continue the game of musical Aerons, Skype’s VP of Products, Alex Kazim, moves up to the vacant president post, reporting directly to Skype founder and CEO Niklas Zennström. Kazim has been bouncing around the eBay organization in various roles for seven years, after running the show at video game studio IX Entertainment (maker of 1996’s Golden Gate, an artistic success but a commercial failure) and managing software development projects at Apple. Kazim seems like a better fit for this post than Dutta was, bringing more of an engineering focus than a financial one to a highly technical field, so these changes should work out to eBay’s advantage in the long run.
That leaves one more major executive change to report: Lorrie Norrington adds the presidency of eBay International to her existing Shopping.com presidency, replacing Matt Bannick, who moves into “eBay’s initiatives in corporate philanthropy and the developing world.” That move is effective immediately, unlike the others. Norrington has held leadership positions at financial software maker Intuit, and spent 20 years climbing the corporate ladder at GE, where she ended up as president and CEO of FANUC Automation. Her experience with global sales is unclear, whereas Bannick brought significant international experience to the table. As a consultant with McKinsey and Company, he was part of the US diplomatic staff sent to help unify Germany when the wall came down. In other words, it’s not clear that this is an upgrade.
Overall, however, eBay seems to be making some good moves here, making better use of the talents and experiences of its top staffers. Or in the company’s own words, “The changes will help the company leverage the expertise of top executives and further eBay’s efforts to build synergies among its brands.” I suppose buzzword bingo is okay as long as the company actually means what it’s saying.
Manufacturer: Dell (product page)
Product specs: Intel Core Duo T2300 CPU (1.66GHz), 1GB RAM, 100GB hard drive, Intel GMA 950 (integrated graphics), WiFi, extended 9-cell battery
Price as configured: US$800 + tax (shop for this item)
Dude, you’re getting a Dell
The Dell e1405 is a brand-new entry in Dell’s Inspiron line of entertainment notebooks (it’s also marketed as the 640m on the business side of Dell’s offerings). With a 14" widescreen display, a Core Duo CPU, a 100GB hard drive, and a gig of RAM, the machine’s no slouch in the performance department. But when you can get a system like this—including an extended battery—for US$800, corners have to be cut somewhere. Right?
To find out, we put the laptop through its paces for a few weeks in an attempt to answer the burning question: can such an inexpensive piece of kit really stand up to computers twice its cost? Think of this review as a companion to our earlier investigations of two other Core Duo machines: the ThinkPad X60 and the new MacBook. How does the e1405 stack up? The answer might surprise you.
You never have a second chance to make a first impression
Dell has finally received the memo—you know, the one that Apple has been circulating for the last few years, the one that argues for the importance of style to consumer electronics. Dell, as much as anyone, typified the “beige box” syndrome that afflicted PC vendors for a decade or more, but the company now tries to atone for past design sins with sexier laptops like the e1405. Whether you like the design or not is a notoriously personal decision, but I’m willing to go on the record with my own opinion: it looks pretty good.
Is that an extended battery, or are you just glad to see me?
The new Inspiron lineup will never be as hawt as the MacBook Pro, for instance, with all that shiny metal and that smooth silver underbelly, and it will never be as svelte as the X60. Most components are made of a silver-speckled plastic and the machine weighs in at around six pounds with extended battery, but the overall effect is quite pleasing. Looked at from above, the machine has fine lines and an attractive silver/white/black color scheme. It won’t be the slickest machine at your local Starbucks, but you’ll look pretty decent without dropping more than a grand—no small feat. And compared to the older generation of Dell machines, the e1405 is like an 18th-century allegorical painting about the Triumph of Design.
Take a closer look at the picture above. See that one-inch bulge protruding from the back of the laptop? That’s the extended battery. To my eyes, the bulge actually looks pretty decent, more like a beauty mark than a goiter, but opinions may vary on this. If you plan on getting the extended battery, make sure the machine still fits in your laptop case.
Detailed tech specs

- Intel Core Duo T2300 CPU (1.66GHz)
- 14.1" WXGA LCD (1280×800)
- 1GB DDR2 SDRAM
- Intel GMA950 integrated graphics
- 100GB 5400rpm SATA hard drive
- Windows XP Media Center Edition
- 10/100 BaseT Ethernet
- Combo CD-RW/DVD-ROM optical drive
- 802.11b/g wireless card
- 85WHr 9-cell LiIon battery
Sooner or later, all "old media" companies find themselves threatened by a site or phenomenon on the Internet. We’ve seen it happen with the music industry, TV, newspapers, and many others. Sometimes, it takes a while for the old guard to discover what’s happening—that appears to be the case with Yell, which calls itself the world’s largest yellow pages publisher.
The problem—from Yell’s point of view—is Yellowikis, a wiki-based business directory available in several languages and containing listings for several different countries. The directory publisher is accusing Yellowikis of "misrepresentation," maintaining that the site’s name "constitutes an ‘instrument of fraud.’"
At first glance, it seems like a case of an elephant feeling threatened by a gnat. Yellowikis has only been operating since January 2005, has around 5,000 listings, and is run entirely by volunteers. Yell, in contrast, had revenues of US$2.4 billion during 2005. However, Yellowikis offers something a telephone directory publisher cannot: dynamic, customizable content. Once a yellow pages business directory is published, that’s it until the next edition.
Yell wants Yellowikis to pay damages and surrender the domain name, perhaps so it can launch a wiki-like service. As "Yellow Pages" is a trademarked name in the UK and Yellowikis refers to itself as "Yellow Pages for the 21st Century," the small wiki may find itself embroiled in an expensive legal fight.
Even if Yell wins or forces a settlement, it won’t change the fact that the business model of selling advertising, printing it in gigantic phone books, and dropping yellow pages directories off on front porches is endangered. Many directory publishers realize this and have developed an online presence that mixes paid placements in with search results. Others, like Verizon, are getting out of the yellow pages business altogether.
While we faithfully replace the old phone directories with the new ones we find next to our front door each summer, actually cracking one open to look up a listing is a rare occurrence. That’s what the Internet is for.
The Executive Branch’s Office of Management and Budget has just released a memo (spotted at SecurityFocus) that’s intended to staunch the flow of sensitive information that federal agencies have been practically hemorrhaging for some time now. Many of the recent high-profile stories involve either portables with sensitive data falling into the wrong hands (e.g., the VA laptop thefts) or some form of remote access (e.g., the DOJ incident), so the OMB has decreed that all mobile devices need some form of encryption and all remote access must be protected by two-factor authentication. Here’s the list of mandates from the memo:
- Encrypt all data on mobile computers/devices which carry agency data unless the data is determined to be non-sensitive, in writing, by your Deputy Secretary or an individual he/she may designate in writing;
- Allow remote access only with two-factor authentication where one of the factors is provided by a device separate from the computer gaining access;
- Use a “time-out” function for remote access and mobile devices requiring user re-authentication after 30 minutes of inactivity; and
- Log all computer-readable data extracts from databases holding sensitive information and verify each extract including sensitive data has been erased within 90 days or its use is still required.
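The time-out mandate, at least, is trivial to implement. Here's a minimal sketch in Python of what the check amounts to (the `Session` class and all its names are hypothetical, not taken from any agency's actual code):

```python
import time

# Per the OMB memo: force re-authentication after 30 minutes of inactivity.
REAUTH_AFTER = 30 * 60  # seconds

class Session:
    """Tracks the last user activity and decides when re-authentication is due."""

    def __init__(self, now=time.time):
        # `now` is injectable so the logic can be tested with a fake clock.
        self._now = now
        self.last_activity = now()

    def touch(self):
        """Record user activity, resetting the inactivity clock."""
        self.last_activity = self._now()

    def needs_reauth(self):
        """True once the inactivity window has elapsed."""
        return self._now() - self.last_activity >= REAUTH_AFTER
```

A real remote-access gateway would of course hang this check off every request rather than poll it, but the policy itself reduces to a single comparison.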
The memo doesn’t specify the particular technologies that are to be used, but it does stipulate that the agencies have only 45 days to comply. One hopes that this is the first move in a more comprehensive government-wide infosec reform that will be unveiled soon, and not somebody’s idea of the ultimate solution to the government’s mounting security woes.
Laptop encryption: it’s a good idea for everybody
I’ve been on a big backup kick this past week, and I snagged a copy of Knox so that I could make encrypted backups to my iPod. I like having a portable copy of all my work with me in case my apartment gets broken into and my hardware gets stolen, NAS and all. (I’ve been paranoid ever since this happened to me once back in Boston. I also busted a guy trying to break into my neighbor’s apartment last month.) Just in case I accidentally leave the iPod somewhere, I don’t want the person who finds it to have copies of my Quicken database—hence the encryption part.
Anyway, in the course of my research I’ve found that laptops are an increasingly common target for identity thieves. If your laptop gets boosted from a coffee shop table or from the back seat of your car, the odds are pretty good now that the thief is just as interested in your Firefox cache and other sensitive data files as he or she is in the hardware itself. So if you’re someone who does online banking on your laptop, you should seriously consider some kind of encryption solution, like Apple’s FileVault, or Knox.
FileVault kind of scares me, because I’ve had to hard-reset OS X a few times before, and the homework I’ve done tells me that this might hose my account if I had FileVault on. I also worry about the potential performance impact of encrypting all the disk reads and writes, but from what I can tell this isn’t too much of a problem. I guess that makes sense if you’ve got enough RAM and you’re not writing to the disk that often. It might get painful during one of my marathon Illustrator sessions, though.
Right now, I’ve settled on Knox, especially since you can get it bundled with Pathfinder. Knox is nice, because it creates an encrypted disk image that you can mount as a volume. You can then move your sensitive files to the Knox volume (called a vault) and work from there. The mounted file vault scheme doesn’t address browser cache issues, but there are other solutions for that.
Personally, I’m using Knox for making encrypted backups to my iPod and NAS box. I mount a secure vault and back up to it with my backup tool of choice, Chronosync. Knox itself then handles backing up the image to the iPod when I dock it, and Chronosync backs up the image to the NAS on a weekly basis.
I haven’t yet taken the plunge and started encrypting my regular working data, but that’s the next step.
So now that you’ve heard about my laptop security setup, I’d be most interested in hearing about yours. Drop into the discussion thread and share your tips for keeping your laptop’s sensitive data out of the hands of thieves in the event that those thieves get their hands on the hardware itself.
As our ability to study other stars has improved, the signs of planets orbiting them have multiplied. We can directly observe planet-forming disks and the indications of planets carried in the light of the stars they orbit. But we have yet to get a look at one of these planets itself, and instead have to rely on computer models based on the average properties of the material that surrounds such stars. The limitations of this situation should be obvious. Not only are the models based on a single sample (our solar system), but the average properties of a system may hide important details. For example, the four rocky inner planets of our solar system can just about squeeze into Jupiter's Great Red Spot, so on average, planets in our solar system are a lot like Jupiter. Most of the planets we've detected elsewhere are pretty similar. Meanwhile, we're really hoping to identify things that look more like Earth.
To do that, we're eventually going to have to start looking at the planets themselves. The problems with that are nicely laid out in an article that proposes a workaround. The biggest problems are ones of scale: at the distances of even nearby stars, the planets are nearly on top of them from our perspective, while the stars themselves would be expected to be 10^10 times as bright as any nearby planets. This effectively rules out looking from Earth, meaning any solution will have to involve an orbiting telescope. Two telescopes orbiting in concert might do the trick via interference patterns, but we seem to have enough trouble maintaining one such observatory, so that's less than ideal. A single telescope with a precisely shaped insert designed to block out the star's light (called a coronagraph) might do the trick, but the optics would have to be extremely precise, and we wouldn't be able to do any real-world testing on it until it was actually in space.
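To get a feel for the scale problem, here's a back-of-the-envelope sketch (the function and the specific numbers are illustrative, not from the article). By the definition of the parsec, a planet 1 AU from its star subtends 1 arcsecond of separation when viewed from 1 parsec away, so the separation in arcseconds is just AU divided by parsecs:

```python
LY_PER_PARSEC = 3.2616  # light years in one parsec

def angular_separation_arcsec(sep_au, dist_ly):
    """Small-angle star-planet separation in arcseconds.

    1 AU of separation at 1 parsec subtends 1 arcsecond by definition,
    so the angle is simply (separation in AU) / (distance in parsecs).
    """
    return sep_au / (dist_ly / LY_PER_PARSEC)

# An Earth-like planet (1 AU from its star) at the article's 30 light
# year range works out to roughly a tenth of an arcsecond:
theta = angular_separation_arcsec(1.0, 30.0)  # ~0.11 arcsec
```

A tenth of an arcsecond is a resolvable angle for a space telescope; the hard part is the ten-billion-to-one brightness contrast sitting inside that tiny gap.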
So, are we basically stuffed? Not entirely, according to the article. The workaround proposed is to make the coronagraph external to the telescope. Although this would also require two orbiting platforms, the second would have to do little more than deploy a 30-50 meter sheet, and then coordinate its position with the telescope. If the positioning can be worked out properly, then fine scale adjustments could be worked out based on actual experience. Even if it fails completely, the expensive part of the observatory—the telescope—should still be able to perform normal observations. Calculations in the article suggest that it could detect everything from the habitable zone of a star outward, and would work at distances of over 30 light years, making a significant number of stars accessible.
Everyone knows about the little rating symbols they see on their games, but very few people think about the process that puts them there. The fact is the ESRB rates games in a very passive way: by watching videos of what the developer considers to be the most graphic content of the title. If the developer or publisher doesn't put it on the video, the ESRB doesn't know it's there. The logistics of actually playing through every game and seeing everything there is to see would be impossible, so this is the only way games are going to get their ratings. Well, unless they secure a much larger budget and start paying people to play games and look for nudity—a job I'm all sorts of down for, if they decide to go that route. They know how to reach me.
Disclosing everything that's in your game has become very important after Hot Coffee and the re-rating of Oblivion. We've covered the need for full disclosure on Ars before. Everyone's looking for the next big story, so I'm guessing there aren't a lot of developers out there who would be willing to try to sneak something by. The ESRB is also trying to make its point by fining people who don't play nice. Still, I don't think we've heard the last of this issue.
It's a tough situation. No one has the time or the money to play every game and look for all the instances of violence, language, sex, drug use, and everything else we have to keep from the minds of impressionable children. On the other hand, simply looking at videos of games being played by others and having your content be controlled by people who are obviously trying to go for a Teen rating to increase their sales is just begging for trouble.
Stephen Hawking is one of science's quintessential iconic figures. When he opens his mouth to speak, the rest of us shut up and listen. Thus it was with great interest that I noted he had published a new article. It goes without saying that Hawking talks of nothing less than the origin of the universe. However, unlike other papers on the subject that I have tried to present here, I could actually understand some of this one without too much effort. That is the sort of thing I really appreciate.
Theoretical cosmology usually works the same way everything else in physics works: figure out the initial conditions, figure out the fundamental physics, apply the initial conditions to the fundamental physics, and out fall all the answers you could hope for. Unfortunately, when this approach is applied to cosmology, one ends up with a finely tuned universe, where the initial conditions must be very precisely defined to get the universe we observe. The cause of this problem, according to Hawking, is that we are essentially in the middle of the experiment and are thus in no position to determine what the result of any given set of initial conditions would be. His solution has been to figure out how to solve the governing physics without initial conditions, instead using what might be considered later conditions to help constrain the system. Combine this with the quantum nature of the universe, and what you get is the wave function of the universe, which doesn't require a special state to get to the way we observe it to be.
In some respects, this looks a lot like the anthropic principle at work, but there is no problem with that, since we are not selecting boundary conditions that lead to what we observe. Better yet, the no-boundary-conditions approach naturally leads to inflationary periods such as the one experienced by the early universe and the one we are currently experiencing. Not only that, but this approach should lead to predictions about the structure of the cosmic microwave background, which should distinguish it from other competing approaches.
The following article was authored by Charles Jade.
After years of allying itself with third-party developers challenging the dominance of the iPod, Microsoft apparently has had enough. The failure of partners like Creative Technology, whose Zen players are a distant second to the iPod line, has spurred Microsoft into creating its own hardware.
While details are sparse—not even a name—the Microsoft player will be in the stores in time for Christmas. What is known is that it will boast a larger, "more advanced" screen than the current video iPod, and that it will have WiFi, presumably for downloading content directly to the player. There is also a pitch towards "Connected Entertainment" and social networking, possibly in conjunction with Xbox Live and/or Windows Mobile handsets. Adding web browser functionality would seem a more concrete feature and an obvious plus, but perhaps the biggest feature of this initial model isn't in the hardware.
To attract current iPod users, Microsoft is going to let you download for free any songs you've already bought from the iTunes Music Store. They'll actually scan iTunes for purchased tracks and then automatically add those to your account. Microsoft will still have to pay the rights-holders for the songs, but they believe it'll be worth it to acquire converts to their new player.
Considering the number of songs sold thus far through iTMS, that could literally be a billion-dollar feature, and more than anything else it shows how serious Microsoft is. Microsoft is also negotiating with music and television executives for content, though the latter have apparently not yet committed… but they will. As for Microsoft's current partners, such as Creative Technology, Samsung, Urge, RealNetworks—who cares? They failed. The question now is whether Microsoft will fail. Probably. The iPod is entrenched and will be dominant for the immediate future regardless of what Microsoft does. However, Sony and Nintendo were entrenched in the game console market once too…
When a Community Technology Preview (CTP) build of Windows Vista was released to the public earlier this year, it contained a new tool called Windows Performance Rating, designed to analyze your computer’s various subsystems and return a performance rating between 1 and 5 that indicated how well Vista would run on that machine.
Feedback and criticism of the first iteration of this tool has caused Microsoft to change the way the utility works. Firstly, the app has been renamed “Windows Experience Index,” which for some reason makes me think of Jimi Hendrix; the name change is designed to make it clear that it is less a benchmarking suite that tells you how fast your computer is, and more an indicator of how well Vista will work.
The final result has also been renamed the “Base Score” to clarify that it represents the lowest score from the series of benchmarks, rather than an average result. The base score is still a number from 1 to 5, but in theory, faster computers available in the future will be able to score a 6 or even a 7. In the first version of the tool, the final score was rounded to the nearest integer, but fans of decimal places will be pleased to know that the new version reports the score to one decimal place.
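In other words, the Base Score is a min(), not a mean(). A quick sketch makes the difference obvious (the subscore names and values below are invented for illustration, not Microsoft's actual categories):

```python
def base_score(subscores):
    """Windows Experience Index-style base score: the *lowest* subscore,
    reported to one decimal place, rather than an average."""
    return round(min(subscores.values()), 1)

# A hypothetical machine with a fast CPU but integrated graphics:
machine = {
    "cpu": 4.8,
    "memory": 4.5,
    "graphics": 3.1,
    "gaming_graphics": 3.0,   # integrated graphics drag the rating down
    "disk": 4.2,
}
```

With numbers like these, an averaging scheme would report roughly 3.9, but the weakest graphics subscore pins the Base Score at 3.0, which is exactly the behavior Intel is grumbling about.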
Reaction to the new tool is mixed. Intel is apparently less than pleased with the utility, stating officially that they “continue to work closely with Microsoft to shape and influence [the tool], but have no further comment at this time.” Unofficially, however, an Intel source has complained that the WEI is “very heavily focused on graphics performance.” That’s bad news for the maker of integrated graphics chipsets, which typically score poorly on these sorts of benchmarks. The fact that the base score is the lowest of the bunch means that a beefier CPU or faster RAM won’t show up on the final tally. While Intel’s latest chipsets are perfectly capable of running Vista even in its most advanced graphics modes, nobody wants a computer that scores a 3 when your neighbor has a 5.
Computer manufacturers also have some reservations about the tool. According to one source, “one problem area makes an entire machine look like it has problems, which will have manufacturers confusing the issue by spec-listing sub-scores for each component or engineering computers to play well to this test.”
Companies like ATI, on the other hand, are pleased with the tool. “It should be very clear to everyone how important graphics are,” said Andrew Dodd, a software product manager at ATI Technologies. “As long as Microsoft makes it clear what each rating means and why they are getting that rating, it is a very good thing for end-users.”
So will the Windows Experience Index boost sales of graphics chips from companies like ATI and NVIDIA, while harming integrated graphics chips from vendors like Intel? It doesn’t seem terribly likely. The big selling point of integrated graphics is the lower price they bring to the computer, and price-conscious customers are not likely to care about whether Vista gets a 3 or a 4. Still, one wonders if the WEI rating might start showing up on retail displays as an additional selling point once Vista ships. The fact remains, however, that one low score in the series of benchmarks will still negatively influence the final rating number, which is a concern for manufacturers.