Diamonds might be a girl's best friend, but Sparkle's Diamonds Sputtering technology looks to cozy up to videocards in an attempt to offer better heat dissipation.
The company today announced the new technology, which it says consists of outfitting the cooling fins on videocards with a Diamond-like Carbon (DLC) membrane. According to Sparkle and its R&D team, DLC offers high heat conduction capable of dissipating heat much more effectively than copper alone.
"The diamonds do heat dissipation four times faster than copper, it relies on the phonons which is produced by the crystal lattice vibration, to bring heat to lower temperature places," Sparkle wrote in its press release. "Diamond-like Carbon can achieve both functions at the same time, that is, transferring heat to lower temperature places with both graphite metal bond and diamond insulation bond (the covalent bond)."
The press release gets even more technical, going on to discuss the Plasma Enhanced CVD (PECVD) process used to plate the DLC membrane onto videocards, but the end result is a 5°C temperature reduction on a 9500GT, according to Sparkle. Don't hold your breath for diamond-cooled videocards any time soon, though. Sparkle admits the technology carries a "high" cost and is still mulling over bringing DLC to market.
Sweden-based BitTorrent indexing site The Pirate Bay goes to trial today on accusations that the popular torrent site has helped millions of users illegally download copyrighted material. If found guilty, Fredrik Neij, Gottfrid Svartholm Warg, Peter Sunde Kolmisoppi, and Carl Lundström could each receive up to two years in prison along with a 1.2 million kronor (just over $140,000) fine.
At least two of the defendants don't seem too worried about the trial, and during a webcast news conference over the weekend offered a defiant message to the Swedish court.
"What are they going to do about it? They have already failed to take down the site once. Let them fail again. It has its own life without us," Ward was quoting as saying by TorrentFreak.
The court will also hear a civil claim brought by Warner Bros., MGM, Columbia Pictures, 20th Century Fox, Sony BMG, Universal, and BMG. Collectively, the companies are seeking 120 million kronor ($14.3 million) to compensate for alleged lost revenues.
"It does not matter if they require several million or one billion," said an also defiant Peter Sunde. "We are not rich and have no money to pay. They won't get a cent."
We'll continue to follow the trial as it unfolds, which, according to the prosecution, is expected to last for 13 days.
With the recent release of Nvidia's GTX 285 (single GPU) and 295 (dual-GPU) videocards, ATI's performance crown has been under siege. But according to chatter around the web, the GPU maker is set to respond with a new videocard in a couple of months.
Specifically, VR-Zone claims to have confirmed ATI will release its HD 4890 in April. The new card is expected to use the RV790 core and would appear to put to rest an earlier rumor stating ATI plans to name its new card the HD 4970. As currently spec'd, the HD 4890 will come clocked at 850MHz with GDDR5 running at 975MHz. The current RV770-based HD 4870 runs at 750MHz (core) and 900MHz (memory).
VR-Zone also says there will be two versions of the new card, a standard and an OC edition. The standard edition is expected to launch in mid-April, with the OC card reaching retail by the end of April. If the rumor pans out, expect the OC edition to cost $299 at launch.
It appears as though the mobile sector is gearing up for a dual-screen revolution, or at the very least, we expect to see the concept start to become more readily available. Last month Lenovo introduced its two-screen W700ds Thinkpad, and now gScreen is seeing double.
Unlike Lenovo's W700ds, gScreen's G400 sports two full-sized 15-inch LED-backlit displays. Graphics chores are handled by either an Nvidia Quadro FX 2700M or GeForce 9800M GT, both with 512MB of video memory. Other specs include an Intel Core 2 Duo T9600 (2.8GHz) or P8400 (2.26GHz), up to 8GB of RAM, up to 500GB of hard drive space, and the usual assortment of ports.
The company says it is also working on a ruggedized version called the TITAN M-1, which is "being built specifically to specs requested by the U.S. Navy for extreme environments." The internal hardware will be a bit different, and gScreen isn't willing to comment on all of it, but the company did say the machine will come equipped with an Intel Core 2 Quad QX9300 processor, 4GB of RAM, and a 500GB hard drive, and will be built to MIL-STD810F standards.
No word yet on price or ship date, but gScreen says customers can reserve the G400 starting February 25th through Amazon.com.
Seemingly everyone has big plans for the cloud these days, including Mozilla, which on Thursday launched an open-source project called Bespin. The basic idea behind Bespin is to offer a web-based programming framework that brings together the speed of desktop-based development with cloud computing. While in very early form, Mozilla has set some high-level goals for the project:
Ease of Use - the editor experience should not be intimidating and should facilitate quickly getting straight into the code.
Real-time Collaboration - sharing live coding sessions with colleagues should be easy and collaboratively coding with one or more partners should Just Work.
Integrated Command-Line - tools like vi and Emacs have demonstrated the power of integrating command-lines into editors. Bespin needs one, too.
Extensible and Self-Hosted - the interface and capabilities of Bespin should be highly extensible and easily accessible to users through Ubiquity-like commands or via the plug-in API.
Wicked Fast - the editor is just a toy unless it stays smooth and responsive editing files of very large sizes.
Accessible from Anywhere - the code editor should work from anywhere, and from any device, using any modern standards-compliant browser.
As it stands now, Bespin 0.1 is just an initial prototype framework with support for basic editing features like syntax highlighting, undo/redo, previewing files in the browser, and other low-level tasks. In the long run, Mozilla hopes to "empower web developers to hack on the editor itself and make it their own."
Developers who want to give the early prototype a whirl can access the Bespin demo here.
With IBM having recently announced it was building a supercomputer with 1.6 million cores capable of 20 petaflops of computing power, it's hard to get too jazzed over a single petaflop. But for Europe, breaking the petaflop barrier is something that hasn't been done yet, but soon will be.
IBM and German research center Forschungszentrum Juelich are collaborating to build a new Blue Gene/P System supercomputer for Europe. It will mark the first time that a supercomputer capable of delivering petaflops of performance will be located outside of the U.S.
"With speeds over a Petaflop, this new Juelich-based supercomputer offers the processing ability of more than 200,000 laptop computers," explains Professor Thomas Lippert, lead scientist of the Juelich supercomputing center. "In addition to raw power, this new system will be among the most energy efficient in the world."
The Blue Gene/P System will house 294,912 processors, 144TB of memory, and 6PB of hard drive storage contained within 72 server racks. Adding to the historical significance, it will also be IBM's first water-cooled supercomputer. IBM says the use of water cooling will result in a 91 percent reduction in the air conditioning units that otherwise would have been required to cool the data center.
Some people harness the awesome power of Google Earth to view distant lands they may never reach, take in a crime in progress, or maybe even find a 3 billion dollar shipwreck. At least that’s the claim of Nathan Smith, a Los Angeles musician who appears to have spotted the remains of a Spanish barquentine while zooming in on a shoeprint-shaped object in the Aransas Pass in Texas. This assumption was based on historical records which put a lost barquentine (a three-masted sailing ship) near that location south of Refugio, Texas, in 1822.
After consulting with a few experts, he traveled to the location, which just happens to be the private ranch of the late Morgan Dunn O’Connor. The result of this drama will end up being decided in the courts, with the family of Mr. O’Connor and Mr. Smith in a bitter dispute over salvage rights. If the courts determine that the land is located within a navigable waterway, the first person to find the wreck is entitled to the spoils; otherwise, the bounty goes to the O’Connor family.
As if this weren’t complicated enough, the state of Texas is also considering its options because it disputes the existence of a commercial waterway near the wreck’s location. If this is proven true, the state might have found a surefire way to balance its books come budget time. U.S. District Judge David Hittner will rule on the salvage rights within two months’ time.
When you consider the complexity of modern-day web pages, it’s actually a bit of a miracle that search engines work as well as they do. Dealing with duplicate links, especially on sites such as Amazon that may promote an individual product a thousand times or more, has always been a challenge. Finally, after years of debate, Google, Yahoo, and Microsoft are putting the past behind them to solve this age-old issue. The solution is a simple “canonical” attribute added to the standard link tag.
The tag is designed to solve issues associated with multiple URLs pointing to the same page, but may also be helpful when multiple versions of a page exist. Currently, the search engines employ a process that examines the structure of URLs for similarities. This generally works pretty well, but is far from perfect. It is considered somewhat rare for search engines to come together on any issue, but it isn’t unprecedented. In 2006 they joined forces to put unanimous support behind sitemaps.org, and in June of 2008 they jointly announced new standards for the robots.txt directive. Matt Cutts of Google and Nathan Buggia of Microsoft claim this new approach should help reduce the clutter on the web and improve the accuracy of all search engines.
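To make the mechanism concrete, here is a minimal sketch of how a crawler might pick up the canonical hint from a page. The page markup and the example.com URL are hypothetical, but the tag format matches the jointly announced standard: a `<link rel="canonical">` element in the page's head pointing at the URL the duplicates should consolidate to.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of any <link rel="canonical"> tag encountered."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonical = a.get("href")

# A hypothetical product page reachable through many URL variants (tracking
# parameters, session IDs, etc.); the page itself declares which single URL
# search engines should index.
page = """
<html><head>
  <link rel="canonical" href="http://www.example.com/product?item=1234" />
</head><body>...</body></html>
"""

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)
```

However many duplicate URLs render this page, a crawler that honors the tag can fold them all into the one declared address instead of guessing from URL structure.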
Even though the new tag won’t completely solve every duplicate-content problem on the web, it should significantly enhance the indexing performance of search engines, particularly on e-commerce sites. The new tag will be discussed in depth at this year’s Ask the Search Engines panel at SMX West.
Gigabit Ethernet may still outrun all but the most extreme SSD RAID configurations, but researchers can never rest on their laurels. Always hoping to invent the next big thing, scientists now have their sights set on Terabit Ethernet to help quell our insatiable hunger for bandwidth. A team from Australia, Denmark, and China has combined efforts to demonstrate terabit-per-second speeds using fiber optic cables, laser light, and an unusual material named chalcogenide.
The group documented the results of its most recent trial in a white paper published in the February 16, 2009 issue of Optics Express. Though the technology is promising, Ben Eggleton, research director for CUDOS (Centre for Ultrahigh bandwidth Devices for Optical Systems), points out the current limitations. “The problem isn't injecting that much high speed data into an optical strand, called multiplexing, but retrieving data at such high rates.” Conventional electronics are capable of injecting dozens of 10 Gbps streams, but trying to retrieve these streams any faster than 40 Gbps is beyond our current capabilities.
The breakthrough here, however, isn’t in the speed itself, but in proving the concept. Until the processing hardware catches up with our transmission capabilities, you won’t be finding this in routers anytime soon. Eggleton speculates that these concepts can be adapted to achieve slower and more manageable results, but the goal of this experiment was simply to prove that it was possible using fully photonic chips built with the same methods employed by current CMOS circuits. "It's years to complete," Eggleton said of turning these research efforts into a production technology. But these demonstrations "are starting to establish this is a serious proposition."
It is a disgrace that humans still haven’t gotten the hang of setting passwords. Most internet users seem stubbornly committed to choosing weak ones, which may force hackers to reconsider their supposedly grueling profession. As you devour more of this story, you will begin to envy hackers for having it stroll-in-the-park easy.
A new study has revealed, or rather reiterated, that internet users nonchalantly continue to set unimaginative, fatuous passwords. The study appraised 28,000 passwords that were recently stolen from a U.S. website.
Sixteen percent of the users had set their first name as their password. Around fourteen percent chose the easiest-to-recall key combinations, including “1234” and “12345678”. Other users, who apparently don’t rate their mathematical ability highly, steered clear of numbers and settled for passwords such as “AZERTY” and “QWERTY”.
Five percent of the passwords were inspired by popular things and celebrities, including names of movies, TV shows, and actors. The strongest password in this category was found to be “Ironman”, presumably because it sounds impenetrable.
Three percent of the people reckon passwords are another medium of expression. How else would you explain passwords like “Iloveyou” and “Ihateyou?”
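The kind of appraisal the study performed boils down to a frequency count over the leaked list. The leaked data set itself isn't public, so the sample below is a purely illustrative stand-in, but the tallying approach is the same:

```python
from collections import Counter

# Toy sample standing in for the 28,000 leaked passwords; these
# values are illustrative, not taken from the actual breach data.
leaked = ["1234", "12345678", "qwerty", "michael", "1234", "iloveyou",
          "azerty", "ironman", "1234", "ihateyou", "qwerty", "michael"]

# Tally case-insensitively and report the most common choices as a
# share of all accounts, the way the study's percentages are framed.
counts = Counter(p.lower() for p in leaked)
total = len(leaked)
for pw, n in counts.most_common(3):
    print(f"{pw}: {n/total:.0%} of accounts")
```

Even on a toy sample, a handful of passwords dominate the tally, which is exactly why unimaginative choices make a cracker's dictionary so effective.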