Blog
Jan 01, 2019

Universal Music Group, a division of Paris-based Vivendi Universal, has just announced plans to revamp the way they sell music CDs in European markets. Rather than serving up every product in the same 20-year-old CD case design, Universal will divide their offerings into three tiers with different packaging and different prices.

At the bottom end, a “basic” package will be introduced, consisting of a simple sleeve made of heavy cardstock. It should sell for €9.99, the same price Apple charges for a full album download from the European iTunes Music Store. The package sounds reminiscent of Neil Young’s Mirror Ball, among other cardboard packages offered up over the years, but it’s not a one-off specialty item. Universal hopes to ship some 100 million of these albums by year’s end, starting from the planned September introduction. The basic package is earmarked for older releases and other slow sellers in the traditional jewel case package, and will come with no liner notes or other extras. According to the Guardian, the no-frills concept is an “acknowledgement” of the power of iTunes.

If extra features are what you want, you’ll be more interested in the premium “DeLuxe” packaging at MSRPs around €19.99. That version, aimed at collectors and gift-givers, will feature a beefed-up case and tons of bonus features, ranging from expanded notes and bonus tracks to DVDs filled with behind-the-scenes clips and live performances.

In between these extremes comes the “Super Jewel Box,” a sturdier version of today’s brittle jewel cases, but otherwise much the same as anything you’d find at your local Wal-Mart today. It’s supposed to sell for €14.99, comparable to today’s average CD price, and will feature new releases and popular back-catalog items.

Universal claims to understand that digital downloads are the future, but then goes on to defend the need to boost physical CD sales again. The company’s executive vice-president of international marketing and A&R, a fellow with the notable name of Max Hole, says that downloading has renewed interest in older titles, and that all you need to do in order to sell those albums is lower the prices. “We can grow the CD market,” says Mr. Hole. “That might be a little optimistic, but we can certainly slow its decline.”

I can see some value in the new packaging options: premium bundles will always have a following, and my CD rack is littered with broken cases, so the Super Jewel thing sounds like a welcome upgrade. But while I enjoy Mirror Ball immensely and can appreciate the Earth-friendliness of paper packaging, I’m not convinced that pricing is the only problem to solve regarding slow back-catalog sales. Those albums just aren’t promoted, and there’s also a major convenience difference between downloads and plastic discs. Besides, if all you’re getting is a plain cardboard sleeve with cover art, and you just want to rip it to your iPod anyway, what’s your motivation for getting physical? There’s more research to do regarding the true drivers of the digital revolution, and whether it’s more about price, about convenience, or something else entirely.

Jan 01, 2019

I was surprised to find that an article on the dry topic of crop yields turned out to be genuinely compelling. It seems to have it all: new information, pulled together from a collection of studies, suggesting that the current consensus was based on an outdated technique. The topic is politically charged, and the editors even let the authors get away with a pun in the title ("Food for Thought"). How does this all fit together?

As atmospheric carbon levels go up and the planet warms up, crops are expected to be impacted in a variety of ways. Increased temperature and CO2 are expected to accelerate growth, but soil moisture will decrease, potentially counteracting these effects. Based on a number of studies, it was expected that these factors would largely balance out, with a slight decrease in crop yield possible. The new study addresses one of these factors: how CO2 affects growth and yield of crops. The authors note that the estimates of these effects being used in assessing the impact of rising carbon levels are based on experiments in enclosed buildings, where it's easy to control the atmosphere. But technology has since improved, and free-air concentration enrichment (FACE) technology allows crops to be grown in the open under controlled atmospheric conditions, more closely approximating real-world conditions.

What happens when you compare the results of FACE experiments with those of enclosed experiments? The enclosed experiments produce yield enhancements that are more than two-fold higher than those produced using FACE technology. This, in turn, suggests that the expectations for future crop yields may be over-estimates. As the authors note, "This casts serious doubt on projections that rising CO2 will fully offset losses due to climate change." They do, however, wrap up on an optimistic note: crop plants have been selectively bred for a number of properties, and increased growth at higher CO2 levels may be as accessible to breeding as anything else. The same technology that allowed us to recognize the problem may be useful in breeding a correction for it.
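To see why the two-fold difference matters for projections, here's a toy calculation in C. The percentages are hypothetical placeholders, not figures from the study: if a chamber-derived CO2 fertilization effect roughly cancels the projected climate losses, halving that effect (as the FACE comparison suggests) turns a roughly neutral projection into a net decline.

```c
#include <stdio.h>

/* Illustrative numbers only: suppose chamber studies suggested a CO2
 * fertilization boost that roughly offsets climate-driven losses.
 * Halving that boost, per the FACE comparison, flips the projected
 * net yield change from roughly neutral to negative. */
int main(void)
{
    double climate_losses = -0.20;                 /* hypothetical 20% loss from heat/drought */
    double chamber_boost  =  0.20;                 /* hypothetical chamber-derived CO2 gain */
    double face_boost     =  chamber_boost / 2.0;  /* FACE gains roughly half as large */

    printf("Projected net yield change (chamber): %+.0f%%\n",
           100.0 * (climate_losses + chamber_boost));
    printf("Projected net yield change (FACE):    %+.0f%%\n",
           100.0 * (climate_losses + face_boost));
    return 0;
}
```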

Jan 01, 2019

Some Apple rumors, like the iTablet, never die, no matter how much time passes or how little sense they make. At the other end of the rumor spectrum are ideas so obvious, like the flash-based iPod Nano, that it is only a matter of time before they become products. There are also rumors that no one believes right up until they happen, like the switch to Intel, and there are the rumors that make sense but never seem to happen.

An Apple spreadsheet would fall into the last category.

Was it only a year ago that AppleInsider was touting "Numbers" as the next big software release from Apple Computer? Yes, it was.

Rumors that Apple Computer has been quietly developing its own spreadsheet solution gained a dab of credibility this week as sources pointed to a revealing company filing with the United States Patent and Trademark Office. Just two days after requesting a trademark on the word 'Mactel,' which seemingly describes the convergence of Macintosh design with Intel hardware, Apple on June 8th filed for a standard character mark on the word 'Numbers.'

It's a little over a year later and a new spreadsheet rumor is out, except this time ThinkSecret is the messenger, the name is "Charts," and it's *confirmed* to be true.

Long rumored—or at least, assumed—to be in development, sources say Apple is not planning on positioning Charts as a competitor to Microsoft's Excel, but rather as a more consumer-friendly spreadsheet application that can handle the needs of home users and small businesses but not pretend to execute any of the more advanced functions of Excel.

Presumably, Charts will allow the import of Excel spreadsheets—unless they have advanced functions it cannot pretend to execute. Beyond that, it's anybody's guess. Along with the obvious, ThinkSecret claims development includes nonsensical stuff like Address Book integration. Pricing is unknown; the iWork suite currently sells for US$79.

So, what does Charts—if true—mean for Mac users? Not much.

To date, iWork has made no impression as an office suite. Of course, Apple stresses that it is not competing with Office for the Mac, but anyone who has used Pages could have told you that. The only way this could matter is if Apple did what should have been done three years ago when iWork first came out. Apple needs to make iWork free on new Macs, a true replacement for AppleWorks. That would be a rumor worth seeing come true.

May 08, 2019

What happens when rich, poor, and middle-class countries get together to agree on future IP regulation? If your answer was anything but “gridlock,” you’re an incurable optimist.

Last week’s WIPO (World Intellectual Property Organization) meeting was supposed to mark an important step forward for the WIPO Development Agenda, but instead left it standing in place. The Development Agenda began in 2004 when Brazil and Argentina introduced a proposal for future WIPO regulation that would make WIPO into something more than the international IP police. Future WIPO decisions would be guided by this Development Agenda, which seeks a balance between copyright holders and the public, especially in developing countries where access to IP (think patented drugs, for instance) is a huge concern. The Agenda would also make WIPO into more of a development body by directing the organization to provide technical assistance for developing countries.

WIPO agreed to adopt a Development Agenda and held three meetings on the topic last year. What emerged was a set of 111 proposals (PDF) grouped into six categories. Last week’s meeting of the Provisional Committee on Proposals for a Development Agenda (PCDA) was supposed to come up with a recommended list of proposals to present to the WIPO General Assembly in September. Coming up with proposals is simpler than agreeing on them, though, and the meeting ended up with little consensus on the most important issues.

The EFF attended the meetings and has posted both notes and transcripts on their web site (day one, day two, and day three). Progress was impossible, as the group could not even decide how to evaluate the proposals. On the last day, Brazil and Argentina both announced their withdrawal from the meeting due to concerns that the method of selecting proposals for recommendation had been unfair and that most of their core concerns had not been included. In the end, the PCDA punted, sending the matter back to the WIPO General Assembly.

Depending on what proposals are ultimately passed, the Development Agenda could have a significant impact on issues such as health care and the public domain in countries across the globe. It could also ensure that countries have some leeway in passing their own IP laws, rather than following WIPO decisions in lockstep. Finally, the Agenda would put much more emphasis on technology transfer and technical assistance designed to benefit up-and-coming countries who want to compete in the knowledge economy.

Whether the full General Assembly can come to more consensus than the PCDA remains to be seen. The September meeting will no doubt be contentious, as the EU and the US wield so much power and oppose many of the proposed changes.

May 08, 2019

The Seattle Post-Intelligencer just had a chat with Blake Ross of the Firefox team, talking over points such as the project’s success and how Microsoft motivated the whole project. Blake says that Microsoft’s lack of browser innovation in the absence of decent competition makes him “furious,” and that the Google tie-in is based more on the search engine’s quality than on any marketing agreement.

But while it’s a good read, the interview is short on future direction. Let’s take a look at the published plans to see what’s coming up in the next couple of major Firefox releases.

First, there’s Bon Echo, aka Firefox 2.0. We have reported on that version’s progress a few times already, so let me just point out a few corrections and changes to the earlier plans. It has been said that Windows ME support would be dropped, but apparently not in Bon Echo. The “priority 1” platform support list includes Windows Vista/XP/2000/ME (but not 98), as well as Mac OS X 10.2 and up, and Red Hat Linux (no specific version). Other Linux versions are P2, meaning they will probably be supported but there are no guarantees.

Support for themed or branded builds is high on the wish list, indicating a desire to branch out to more distribution partners. Some functions currently handled by optional extensions, such as session restore after restarts and crashes and on-demand spell checking, are slated for inclusion. Syndication feed handling needs more work, as does the fit-and-finish of the overall application interface. The stated goal is to make Firefox look and feel like a native application across Windows, OS X, and Gnome environments.

Otherwise, not much seems to be changing. There are no major code overhauls here, and there is a greater focus on bugfixing than on performance improvement. In addition, the new database-backed version of the bookmark and history system has been shelved for now. That system and overall performance improvements are scheduled for Firefox 3 at this point, along with greater standards compliance, better security, and improved internationalization (i18n) support.

In general, it looks like Firefox 2 is a spit-and-polish job, designed to look and feel as professional as possible, and leaving the really big changes for the next major release. Along with the marketing-friendly branding feature and the “sizable chunk of revenue” the Mozilla project has amassed from search engine deals and the like, I think I can smell a nice, big marketing push alongside the final release of Bon Echo. Will we all be sick of Firefox TV commercials soon?

May 08, 2019

Music industry lawsuits: they’re not just for the West anymore. The International Federation of the Phonographic Industry (IFPI), the international body charged with protecting music labels, has announced its plan to sue Yahoo China within the next couple of weeks over the search engine’s alleged links to sites hosting pirated music.

Similar action was also taken last year against the leading Chinese search engine, Baidu.com. The IFPI’s threats are based on the claim that it is illegal to link to illicit material, a claim bolstered by a new Chinese law. Bloomberg has the details:

The federation is also considering using a new Chinese law that came into effect July 1 that fines distributors of illegally copied music, movies and other material over the Internet as much as 100,000 yuan ($12,500). As of today, Chinese search engines operated by Yahoo China and Baidu.com provide links to other Web sites hosting illegally copied songs.

The law says a Web site is jointly liable with the host of the pirated files for infringement “if it knows or should know that the work, performance or sound or video recording linked to was infringing.”

Though the explicit threat of a lawsuit is something new, IFPI chief John Kennedy used a May speech in Shanghai to indicate that his organization was prepared to take on more Chinese search engines over the issue of piracy.

It is clear that the ISPs are far from adequately supporting us today. I have been very disappointed in recent months to see some well-known brand names among the internet companies blatantly infringing our members’ rights. Baidu has already been found guilty of copyright infringement in the Chinese courts; China-Yahoo is now in a similar position, choosing to turn a blind eye to the infringements taking place on its service instead of setting the example of responsible practice which we would expect from them. We are watching China-Yahoo closely and will have no hesitation in acting to protect our members’ rights if we should have to.

Though such a lawsuit would hassle Yahoo China (not actually run by Yahoo, but by Chinese operator Alibaba.com.cn), the more interesting aspect to this story is what the recent legal cases say about China. The country has not always been a role model for the rule of law, nor has it always appeared interested in fighting piracy. The situation got so bad that the US threatened to go to the WTO and seek sanctions, but China has recently been saying and doing all the right things. The IFPI suits indicate the entertainment industry’s belief that legal threats now mean something in China, and that the Chinese market is poised for enough growth to make the legal effort worthwhile.

Such moves, coupled with statements from Beijing about the need for China to get serious about enforcement, suggest that the Chinese IP free-for-all may be slowly winding down. With piracy rates for music and software still hovering at 90 percent, substantial change will come slowly, but there’s little doubt that it’s on the way.

May 08, 2019

July 4th approaches, traditionally the time the US celebrates throwing off the shackles of English Imperialism with the consumption of large quantities of alcohol and the combustion of large quantities of explosives. However, not everyone is hoping for a big bang. NASA had been planning to return the space shuttle to the skies, and explosions are definitely not on the agenda.

The space shuttle Discovery has sat on the launchpad in Florida for several days now, waiting for the all-clear signal, but first weather and now safety inspections have raised red flags. The problem remains the insulation on the large external fuel tank that contains the liquid oxygen and hydrogen that fuel the shuttle's main engines. The tank is covered with insulation foam, and bits of this foam can fall off, damaging the orbiter on the way down. Such damage was responsible for the loss of the shuttle Columbia three years ago.

The launch was planned for tomorrow, July 4th, but NASA are meeting today to determine whether or not it is safe to proceed. There are also concerns over whether the repeated draining and filling of the tank is causing flexing that might be the cause of cracks in the foam.

NASA isn't just worried about foam debris. Cape Canaveral is also home to lots of vultures, who feast on the roadkill that results from all the interested visitors. When a 1 kg chunk of foam can result in the loss of an orbiter, a bird strike involving a vulture weighing three times that is a serious problem. As a result, NASA have set a trap several miles away to keep the vultures clear of the launch site. The birds will be released once Discovery is on its way.

May 08, 2019

When I think of lasers, I think of precision optical devices that take hours of careful alignment to optimize. The reason for this is that stimulated emission is usually much weaker than spontaneous emission; without all the feedback created by carefully placed, high-reflectivity mirrors, there would be no laser. This description, while accurate, ignores the overall physics behind the laser, which is best seen when looking at random lasers.

Random lasers have been around for a few years now, and a recent article in the Journal of Applied Physics has given me an excuse to write about them. Instead of building your laser from a nice, optically transparent medium surrounded by mirrors, you simply blast a powder with energy, causing the particles to glow. The trick is that each particle in the powder reflects light in all sorts of directions, and between certain pairs of particles there will exist a path for the light to travel back and forth as though the particles were mirrors. Between the two "mirrors," the light travels along a complex, narrow path that is much longer than the straight-line distance between the two. Thus, in contrast to a conventional laser, where a large volume of material must be excited, only a few atoms take part in the lasing process, so it is relatively easy to get those atoms into the correct state to lase. And since all of those atoms are ready to lase, the mirrors at the end points don't need to be all that good. The best part is that in any powder this occurs for huge numbers of particles, so you end up with more than one laser.

This paper uses zinc oxide embedded in a polymer matrix, which holds the powder in a fixed formation, making it much easier to work with. Zinc oxide, when sufficiently excited, lases in the ultraviolet region, where normal semiconductor lasers will not operate. It also has the potential, through some trickery with nonlinear optics, to produce a huge range of colors from the ultraviolet out to the mid-infrared – a spectroscopist's wet dream.
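The "bad mirrors, long path" tradeoff can be made concrete with the standard round-trip lasing threshold condition, R1 · R2 · exp(2gL) ≥ 1. Here's a minimal C sketch; the reflectivities and path lengths are illustrative numbers, not measured ZnO parameters.

```c
#include <math.h>
#include <stdio.h>

/* Round-trip lasing threshold: R1 * R2 * exp(2 * g * L) >= 1,
 * so the threshold gain is g = -ln(R1 * R2) / (2 * L).
 * All values below are illustrative. */
static double threshold_gain(double r1, double r2, double path_um)
{
    return -log(r1 * r2) / (2.0 * path_um);  /* gain per micrometer */
}

int main(void)
{
    /* Weakly reflecting particles with only the straight-line gap
     * between them: a hopelessly high threshold. */
    printf("poor mirrors, straight 10 um gap:  g = %.3f /um\n",
           threshold_gain(0.10, 0.10, 10.0));
    /* The same lousy "mirrors," but with a long, folded scattering
     * path between them: the threshold gain drops 100-fold. */
    printf("poor mirrors, folded 1000 um path: g = %.4f /um\n",
           threshold_gain(0.10, 0.10, 1000.0));
    return 0;
}
```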

These laser sources are the very antithesis of normal laser development. No care in construction is required, and the system is incredibly robust. If you shake the thing up and down a bit – something that would cause an ordinary laser to hemorrhage – all you get is a set of new lasers. Of course, since the laser is random, no one really knows in which direction it will emit. And since only a few particles make up each laser, the output power doesn't scale to very large values. That isn't so bad, though, since many laser applications don't require high power but would benefit from reduced cost. The biggest problem is getting these things started: at present, that requires a huge pulsed laser, the likes of which would not fit in your CD player – a real obstacle for anyone thinking about monolithic applications.

Apr 08, 2019

Back in the middle of June, Mike Pinkerton checked multi-line textarea spell-checking into the Camino trunk and branch nightly builds.

iWebUserSites is a website directory of pages made with Apple's iWeb.

People have long pined for a text editor that hides all distractions and lets them get down to work. Since a manual typewriter just isn't practical these days, Hog Bay Software has created WriteRoom: "For Mac users who enjoy the simplicity of a typewriter, but live in the digital world. WriteRoom is a full screen, distraction free, writing environment. Unlike standard word processors that focus on features, WriteRoom is just about you and your text. Requires Mac OS X 10.4 or later."

Speaking of cool and innovative applications, Liquifile is a file browser for OS X that has a unique visual representation of your documents: "An alternative Finder, if you will. It is designed for visual thinkers who want to get a bigger picture of their files and feel them more directly. But not only for them ;-). Did you ever wonder why searching and browsing are so separated in today's interfaces? Ever wished you had a bigger screen when finding or organizing your files? Ever lost files in deep hierarchical structures? Ever wondered what the next big thing in file browsing might be…?"

If you doubted Apple was working on a resolution-independent UI before, you might be a little more likely to believe it with Apple's HIGeometry API that popped up on June 28th: "HIGeometry is a Quartz-compatible API for describing and manipulating basic geometric objects such as points, rectangles, and sizes. HIGeometry expresses all coordinates using floating-point numbers. This API provides functions to convert an object’s coordinates into a different coordinate space. These functions support resolution independence mode drawing by taking into account the scale factor of your application’s user interface."
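For a rough idea of what that API surface looks like in use, here's a minimal C sketch. It assumes the HIGeometry types (HIPoint is just a CGPoint with floating-point fields) and the HIPointConvert/HIGetScaleFactor calls as they appeared in Carbon's HIToolbox headers of the era; treat the exact signatures as assumptions and check HIGeometry.h before relying on them.

```c
#include <Carbon/Carbon.h>  /* HIGeometry ships as part of the HIToolbox */
#include <stdio.h>

/* Sketch only: assumes HIPoint/HIRect/HISize are CGPoint/CGRect/CGSize,
 * HIPointConvert() translates a point between named coordinate spaces,
 * and HIGetScaleFactor() reports the UI scale factor used for
 * resolution independence. */
static void report_view_point_in_screen_pixels(HIViewRef view, HIPoint p)
{
    /* Convert from the view's coordinate space to actual screen pixels;
     * the Toolbox applies the scale factor during the conversion. */
    HIPointConvert(&p, kHICoordSpaceView, view,
                   kHICoordSpaceScreenPixel, NULL);
    printf("screen pixel: (%.1f, %.1f) at UI scale %.2f\n",
           p.x, p.y, HIGetScaleFactor());
}
```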

Bleep's BuildFactory makes the process of managing, checking-out, and compiling multiple Xcode projects super-easy. BF seems as if it picks up where Xcode and Apple left off: "BuildFactory is a tool aimed at helping developers do what they do best, develop. With BuildFactory you can run automated builds, build refreshed sources from Subversion, open errors and warnings in your external editor of choice (even Xcode), and build multiple projects with the click of a button." Bleep recently donated a bunch of licenses to the Adium crew!

Apr 08, 2019

Since it's an extended holiday weekend here in the US, science news has slowed down considerably, allowing me the time to write up a post I hope will bring some clarity to an issue that's popped up more than once, including just last week. The topic is junk DNA: namely, does it exist, and if so, how do we identify it? I'm going to go over a few of the types of potential junk DNA and then wrap up with my own conclusions on the topic, just so I can refer to this in the future without restarting the debate over whether junk DNA really is junk.


The concept of junk DNA arose when researchers started sequencing large pieces of the genome and found that very little of it coded for proteins. With the completion of the human genome, we can give a more precise figure: only 1.4 percent of the genome is likely to code for an actual protein. But other genomes tell a very different story. For example, the pufferfish (Fugu rubripes) has a genome that's 1/8 the size of the human genome, but it contains roughly the same number of genes. The difference, it appears, is largely in the junk: Fugu doesn't have much of it, and so serves as a useful point of comparison. Comparing the two genomes can help us identify how junky all the bits of non-coding DNA are. There are a few classes of DNA sequences that appear to have a substantial junk content (a quick tally follows the list):

Inter-regulatory sequences: before a protein gets produced, you have to copy the DNA that encodes it into an RNA message, a process that is regulated by the DNA surrounding the message. The DNA sequences that regulate a gene's expression can reside up to hundreds of kilobases away from the actual gene. But does the DNA in between the regulatory sequences matter? In some cases yes, but Fugu suggests that those are the exceptions. Many of the same regulatory DNA sequences are used in both humans and fish, but in Fugu those sequences are often much closer together, with the intervening sequences eliminated. This suggests that much of the sequence near genes is junk.

Introns: In eukaryotes (which include all multicellular animals), the protein coding portion of a gene is split up into exons. The intervening DNA (termed introns) is eliminated from the final RNA message. All told, the DNA sequence of introns accounts for about 24 percent of the human genome. These introns contain regulatory sequences that signal for the elimination of the intron from the final message, and can contain sequences that regulate gene expression as well. But these account for a small fraction of the total intron sequence. Many organisms (such as flies and Fugu) have much smaller introns than humans, and small, rapidly dividing eukaryotes such as yeast have gotten rid of the majority of their introns.

Pseudogenes: Large duplications of genetic material go on all the time. Some of the duplicated genes (and their accompanying introns and regulatory regions) develop new functions, but others don't get used, and mutations eventually silence them. The human genome is littered with dead copies of genes, called pseudogenes. It's always possible that further mutation will do something useful with these genes, but in many cases, it's highly unlikely. In the case of odorant receptors, over half the nearly 1,000 present in the human genome are now pseudogenes; it's hard to imagine all of them being put to use in the future.

Disabled retroviruses and transposons: Many viruses reproduce by inserting a copy of themselves into the genome. When this process goes badly, an inactive virus is left behind; this process accounts for approximately 2 percent of the human genome. More significant are the transposons, mobile genetic elements that have hopped around the genome and now account for nearly half of it. Most of these transposon copies are non-functional, and will never hop again. Combined, these disabled parasites account for a significant fraction of what is commonly considered junk.

Other stuff: There are other regions of the genome that appear to simply not contain genes. Those regions are largely absent in the Fugu genome, and do not have any obvious function.
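For a sense of proportion, here's a quick back-of-the-envelope tally in C of the figures quoted above. Note that the categories overlap (transposon copies sit inside introns, for instance), so this is an illustration of scale, not a rigorous partition of the genome.

```c
#include <stdio.h>

/* Approximate human genome fractions quoted in the article; the bins
 * overlap, so the sum is indicative rather than exact. */
int main(void)
{
    double coding       = 0.014; /* protein-coding sequence */
    double introns      = 0.24;  /* intronic sequence */
    double retroviruses = 0.02;  /* disabled retroviral insertions */
    double transposons  = 0.45;  /* transposon-derived, "nearly half" */

    printf("protein-coding:           %5.1f%%\n", 100.0 * coding);
    printf("named non-coding classes: %5.1f%%\n",
           100.0 * (introns + retroviruses + transposons));
    return 0;
}
```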

People tend to refer to all of these classes of DNA elements collectively as junk, which is where much of the problem arises. Although the junk is probably useless on average, there are clear exceptions. I've covered at least three cases where transposons or pieces of them have been used to form a functional gene product. These cases are often announced with press releases proclaiming something along the lines of "a new use for junk DNA is found." This tends to obscure the fact that these transposons are only useful within the context of a normal gene. Even in cases where the actual transposons may be doing something useful, it's far from clear whether any individual element, or the huge number of transposons present, is actually required for the useful activity.

So, when I refer to junk DNA, I'm not referring to any specific DNA sequence (which may or may not be useful), but to the collective populations of several types of DNA sequences that, on average, appear to be junk. By extension, I'd say that a lot of the genome appears to be junk. Fortunately, we've reached the point where we can begin to test this experimentally; if I'm wrong about much of the genome being junk, you may see a mea culpa here in the future.

Apr 08, 2019

An article appeared yesterday at Space.com that discusses some of the odd things that pass for normal on a brown dwarf, the class of body that resides between "super Jupiter" type planets and stars. The article doesn't appear to be based on new research but it fascinated me, so I thought it was worth a mention. The article focuses on the work of two researchers who have published extensively on brown dwarves, and much of it appears to be based on a paper they published together back in 2002.

There are noticeable differences between Jupiter and the Sun. Of the two, brown dwarves seem to be a bit closer to Jupiter in terms of behavior, as there are observational indications that they have weather and storms driven by convection (on the Sun, magnetism is a major driving force). Temperature-wise, brown dwarves start out much hotter than Jupiter. Without a star's fusion output, however, they gradually cool over time. This cooling has some bizarre implications: early in their history, brown dwarves have temperatures sufficient not only to melt metal, but to evaporate it. Once the cooling kicks in, that metal will return to the dwarf's surface in a rain of liquid iron.

Part of this view into the world of brown dwarves is based on theoretical modelling, but direct observation has made some contributions as well. These dim objects are extremely difficult to observe, but inferences regarding their temperatures can be made. The researchers were somewhat surprised to find that older dwarves, which should be the coolest, actually appeared hotter and brighter than young ones. It seems that the clouds of hot metal initially act as an insulator, retaining heat for much longer than might be predicted. As these clouds fall to the surface as rain, more heat is able to radiate from the dwarf, making it appear to get hotter as it ages.

Overall, the article points out that, despite often being labeled "failed stars," brown dwarves are interesting objects with a distinct set of features.

Apr 08, 2019

Google has strong words for legislators who are currently mulling over net neutrality issues: the company will take any perceived abuse to the US Department of Justice. Speaking at a news conference in Bulgaria, Internet pioneer and now Google VP Vint Cerf said that the company will be "happy" if legislators ultimately opt to support net neutrality principles, but in the absence of such support, the company will take a wait-and-see approach.

"If we are not successful in our arguments… then we will simply have to wait until something bad happens and then we will make known our case to the Department of Justice’s anti-trust division," he said.

Cerf’s comments recognize one of the most heated debates within the net neutrality fight: whether or not this could become an antitrust issue. Some opponents of net neutrality argue that legislation is not needed because the market will sort it out, a notion founded on the belief that there is more than adequate competition in broadband across the country to prevent monopolistic behavior. Net neutrality advocates often argue the opposite, namely that there is not sufficient competition throughout the country to prevent monopolistic abuses. Without such competition, service providers could punish their competition with impunity, advocates say. "We are worried that some of the broadband service providers will interfere with that principle and will attempt to use their control over broadband transport facilities to interfere with services of competitors," Cerf said.

Recently Senator Ron Wyden (D-OR) announced that he will place a "hold" on any legislation addressing the telecommunications industry that lacks "effective policy" on net neutrality. "The days of unfettered, unlimited and free access to any site on the world wide web, what I call net neutrality, are being threatened," said Sen. Wyden. "Those who own the pipes, the giant cable and phone companies, want to discriminate on which sites you can access."

The war between the two opposing sides is filled with accusations and light on actual facts. Net neutrality opponents have said time and time again that premature action could lead to the stifling of innovation. Telecom lobbyist Mike McCurry recently warned that net neutrality laws "will dampen investor interest in building bigger, faster, smarter pipes," which he argues will ultimately lead to an Internet slowdown. Nevertheless, advocates of net neutrality continue to charge the telecommunications industry with ulterior motives, a charge which they believe "sticks" on account of now infamous comments made by several telecommunications executives over recent years, including AT&T CEO Ed Whitacre’s claim that Internet companies were using "[his] pipes" for free, and that they should not be "allowed" to do that.

Apr 08, 2019

Nanotubes. Lately, it seems like the solution to any problem we can think of lies in harnessing the capabilities of nanotubes—super-strong molecule-sized carbon pipes, which are believed to have potential uses in everything from transistors to tissue growth to infinitely rechargeable batteries. Yet even though commercial applications for nanotube technology have proven somewhat elusive, researchers are continuing to find more applications for the tiny things, and the latest mouthwatering tidbit comes from hard drive manufacturer Seagate.

Seagate has filed a patent application for a design which would use lubricant stored in carbon nanotubes. As the drive spins, the lubricant slowly leaks out of the nanotubes as a vapor, keeping the drive running smoothly and happily for its intended lifespan.

The next question, no doubt, is why would Seagate want to create a hard drive that leaks lubricant?

Hard drive platters coated with a conventional recording medium are capable of recording data to a certain, relatively low density. Greater density is important because it not only allows more data to be stored in a given space, but data can be searched and read more quickly from a higher density platter. Unfortunately, increasing the density—outside of using techniques like perpendicular recording—also increases the instability of the data. In other words, placing ones and zeros too close together increases the likelihood that one bit may "flip" its neighbor.

One solution to this problem is to use a recording medium with a high magnetic anisotropy—which is much harder to alter magnetically, but would allow data to be packed more tightly with less risk of instability. The other side of the coin is that conventional hard drive heads are incapable of generating a magnetic field strong enough to write to such materials.
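The stability tradeoff can be made concrete with the usual rule of thumb for magnetic grains: the energy barrier K_u·V must exceed the thermal energy k_B·T by a factor of roughly 40-60 for decade-scale data retention. Here's a small C sketch with illustrative numbers (not taken from any shipping drive) showing why higher anisotropy K_u buys smaller grain volume V at the same stability.

```c
#include <stdio.h>

/* Thermal stability of a magnetic grain: the rule of thumb says
 * K_u * V / (k_B * T) should be ~40-60 for ~10-year retention.
 * All material parameters below are illustrative. */
int main(void)
{
    const double kB = 1.380649e-23; /* Boltzmann constant, J/K */
    const double T  = 300.0;        /* ambient temperature, K */
    double Ku = 2.0e5;              /* anisotropy energy density, J/m^3 */
    double d  = 10e-9;              /* grain diameter, m */
    double V  = d * d * d;          /* crude cubic grain volume, m^3 */

    printf("baseline:    K_u*V/(k_B*T) = %.0f\n", Ku * V / (kB * T));
    /* Double the anisotropy and the same stability ratio holds at half
     * the grain volume, i.e., bits can be packed more tightly. */
    printf("2x K_u, V/2: K_u*V/(k_B*T) = %.0f\n",
           (2.0 * Ku) * (V / 2.0) / (kB * T));
    return 0;
}
```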

Heating high anisotropy materials makes them easier to record upon, however, and techniques exist for doing just that by aiming a tiny laser beam at the area under the hard drive head. In this way, the recording medium maintains its stability and greater data density, without requiring an exotic and expensive recording head.

The flip side to heating the platter surface, however, is that the all-important lubricant film is either evaporated or decomposed by the heat, which can severely limit the life span of a drive. Replenishing that lubricant is a tricky thing, and that’s where the carbon nanotubes come in handy. Like a high-tech sponge, the nanotubes can be made to hold a supply of the lubricant, which is then emitted as a vapor around the platter to keep things running smoothly for a very long time.

Researchers are continually discovering ways to keep the storage limits of conventional hard drives several steps ahead of the flash-based memory that is predicted to eventually replace them. Nanotube-stored lubricant is one more step toward keeping hard drives around for a long time. Before you head to the store to seek out one of these new drives, however, keep in mind that they probably won’t be on the market for some time—if ever.

Speculation time: while the basic concept seems sound, the concern that comes to my mind has to do with the practical longevity of such drives. Focusing enough heat on the platter to cause changes in the lubricant makes one wonder what the long-term stability of that lubricant will turn out to be. Decomposed lubricant sounds suspiciously like dirt to me, and I have to wonder if, while replenishing lubricant in vapor form might be helpful in the short-term, there may be performance tradeoffs down the road.