IDF 2013 Photo Gallery

22 Comments

Philippe Lemay

A tablet doesn't need much of a power brick (it has a little wall-wart, but that's about it). I'm guessing its lower power consumption means it needs far more humble hardware to charge.

If this could be transferred to low-power PCs and laptops... maybe we'll one day see those kinds of machines get rid of their horrible power bricks! Take Intel's NUC, for example: the machine itself looked interesting, but it came with a horrible power brick almost as large as the "PC" itself!

Chris02141

How long did that poor tablet have to get fingered by that sketch-ball robot?

spectrex

I really can't wrap my head around people who think mobile computing and tablets will take over and the desktop will fade away. Do people honestly believe that the work that goes on in a work environment can get done on a tablet? Try running a large database, photo editing and post-processing, 3D modeling, CAD systems, video editing/processing, etc. on something like a laptop or tablet. There will always be a need for custom systems with raw power and the ability to connect them to other hardware, such as in an industrial setting. Maybe the general consumer market is okay with tablets for email, Facebook, and media consumption, and to be fair there is a lot of money to be made there, but this is not the only market that exists. The world simply can't function on just tablets, and I don't think anyone is ready to commit fully to the cloud given connection failures and, more importantly after recent events, security.

Why is everyone worried about overclocking competitions not being relevant? I remember not too long ago people tried to push the limits to 5 GHz, and now we have a 5 GHz CPU... it's always been about overclockers just doing it to see how far they can go with the knowledge and skill they have. It's just a contest and bragging rights; if you don't get it, then maybe you're not an enthusiast?

USB vs. Thunderbolt? Maybe Thunderbolt has more uses in the professional industry than for the average person. Most people can't afford a Drobo either, but they exist and many people use them. New tech has always been introduced. Remember the AMR risers that used to come on motherboards? No... most people don't. They were there but never caught on... kind of like PCIe x1, which has few uses other than sound cards... I would guess this too will go away some day in favor of something else.

gordonung

I believe the JEDEC spec calls for 288 pins vs. 240 pins, so yes.

compguytracy

I guess my boring old 32 GB of RAM is old news. I see the douches at all these events never invite me to a break-a-world-record event. I've run my stock i7 from 3.4 to 4.5 on water, stable, so all you assclowns with liquid nitro are just poseurs. no2 is never practical in a custom build; let's see something that actually runs an OS and a game. It's like saying my car can run 250 mph with a bunch of work, but it's only good for 12 seconds on a track.

Hey.That_Dude

NO2 is poisonous so..... typo? LN2 or N2?

gordonung

I think you're missing the point on overclocking sports. 

Extreme overclockers who go for records are like people who drag race. They go for records and only worry about going straight. It's not as practical as, say, NASCAR or F1, but those are both very limited too. These are competitions. What you're talking about is everyday computing overclocking, which is quite different.

John Pombrio

Gordo, the writing on the wall is clear: Intel will not be pushing desktop CPUs any longer. Haswell is strictly a mobile/laptop CPU that just happens to be able to run on a desktop. I expect that Broadwell will be even less able to increase desktop speeds or specs, if at all.
Now, is this a BAD thing? With GPUs becoming very powerful and having a transistor count higher than the CPU's, they will be the new darlings of the desktop. Intel could also create chips like the -E series with 6 or 8 cores easily enough, so that will EVENTUALLY be more important when multiple cores are better utilized.
Desktop CPU chips are getting close to their maximum heat dissipation rate, so, as Gordon said, cranking up the clock speed via o/c will only yield marginal benefits over turbo.
Desktop CPUs have reached the limit of their usefulness as a major focus for Intel R&D.

gordonung

I don't think anyone knows the future, but I do know that talk of Haswell-E on the desktop is a GOOD thing for power users. I really suspect there's internal pressure to reduce consumer desktops (and I mean desktops, not mobile-based All-in-Ones or NUCs) to a single socket such as LGA 1150, as it's not cheap for Intel to maintain a two-socket strategy.

The good news here is that Haswell-E exists and will likely come to fruition.

But let's be fair to Intel here: there's a lot of pressure on it to deal with ARM more than anything. I've said it before and I'll say it again: if Intel announced a new CPU with 5x the performance at a 5x price reduction for PCs, Wall Street and analysts would only ask what it has for tablets and phones. Wall Street and analysts see the computing future as tablets and handhelds.

tekknyne

supply-side economics ftw!

AFDozerman

So many negative comments.

warptek2010

Intel is still pushing this Thunderbolt crap. I predict Thunderbolt will go the way of HD DVD, Betamax, etc. USB 3.0 is fast enough and only getting faster. Message to Intel: no one wants to get rid of their USB peripherals anytime soon and go out and purchase something with a completely different interface bus.

Innomasta

Apple is gonna change that, buddy.

big_montana

How is Apple going to change that when Wireless USB will be ready next year? Wireless USB pushes 7Gbps and is compliant with USB 2, 3, and 3.1. And USB 3.x will support 10Gbps. So tell me again why Thunderbolt is needed?
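
Just to put those raw link rates side by side, here's a rough back-of-the-envelope sketch of how long a 25GB file would take at each speed, ignoring protocol overhead and real-world drive limits; the Thunderbolt figures (10Gbps, and 20Gbps for Thunderbolt 2) are Intel's spec numbers, added only for comparison:

# Rough transfer-time comparison at raw link rates (illustrative only, no overhead)
FILE_GB = 25
links_gbps = {
    "USB 3.0": 5,
    "Wireless USB (as claimed above)": 7,
    "USB 3.1": 10,
    "Thunderbolt": 10,
    "Thunderbolt 2": 20,
}
for name, gbps in links_gbps.items():
    seconds = FILE_GB * 8 / gbps  # GB -> gigabits, divided by gigabits per second
    print(f"{name}: {seconds:.0f} seconds")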

acidic

Anything Crapple does just happens to be the latest and greatest. After all, they do invent everything. Don't believe me? Ask any Macolyte.

vrmlbasic

Intel was running demos for USB 3.1?!! I'm sorry, that's too funny. Also, what's so special about USB 3.1 to VGA when we've had USB 2.0 to VGA for years?

2x the performance of an Atom is easy, anything can do it as 2 * 0 is still 0.

"While the robot fingers the tablet, a RED Mysterium-X records the response of the screen at 300 fps. " <--I've heard the phrase "tech porn" used before but never so, ah, literally.

gordonung

Well, not Intel, but the USB-IF, or USB Implementers Forum. Keep in mind, Intel did create USB, though.

My opinion on USB 2.0 to VGA is that it sucks. USB 3 to DVI/VGA was better. I didn't get to see the demo, but the DisplayLink guys had USB 3.1 to 4K. I personally thought the USB-to-VGA dongle was cool because there are so many VGA projectors still in use that vendors feel they have to put VGA ports on notebooks.

I agree Atom's performance has been slow. So slow, I've wondered if the multiplier was set to negative. But 2x Atom gets you to fairly usable performance for Windows, and on Android, it's performance they're not used to seeing. Keep in mind, Clover Trail+ Atom is still slow to us, but testing has shown that it's faster than ARM. In other words, in the land of the blind, the one-eyed woman is queen.

aarcane

Pic 33: It'd be nice to know that ASUS plans to take a bunch of that R&D payoff from designing the UEFI and laying traces and use it to improve the UEFI on all its currently supported boards and all its future boards. Sadly, they probably just forked the codebase and won't backport anything new :(

Insula Gilliganis

- "the desktop PC and PC enthusiasm was alive AT (TYPO Gordon, you meant AND) well.. if you looked hard enough." That should tell you something right there.. if it isn't right out in the open, easy to spot, it must not be all that important to Intel!!

- you find something running Cinebench 11.5.. but no benchmarks to report??

- Gordon, nice caption you wrote for pic #7!!

- 7 of the first 8 pics are all about the tablets.. and "the desktop PC and PC enthusiasm was alive"?? Guess you aren't looking hard enough yet!!

- So when is USB 3.1 going to be on any chipset? Demos run on super-high-priced gear are nice, but when is any of this going to be a reality?? Yeah, I know.. IDF is all about the future!!

- Pic #15.. makes me miss the blue surgical gloves editors used to wear when taking hand pics!!

- Pic #23.. impressive!!

- "192 gigabytes of DDR4 at 2133 mega transfers per second" --> Nice demo of something perhaps my grandkids will actually have available for them to buy at a reasonable price (BTW.. my grandkids aren't even born yet). And what exactly is "mega transfers per second"??

- 3960X, released November 14, 2011.. and.. 4960X, released September 10, 2013 (dates according to cpu-world.com) puts 667 days or 1 year, 9 months, 28 days (basically 22 months) between those releases (see the quick check after this list). Intel would have to shorten that release interval by at least 6 months to get Haswell-E released by the end of 2014.. better guess is it will be released in Spring of 2015. And then add another year for DDR4 to become reasonably priced!!

- Pic #33 --> With all the 5x that went into that board, is it going to cost 5x what it does now??

- Sorry Gordon, but I didn't find all that much in your article that made me feel like the desktop PC is not getting closer to being on life support.. lots of pics showing tablets, NUCs (is this the shape of the desktop PC?), external port devices, server/business gear, and liquid nitrogen overclocking.. sorry if not much of this excites me about the future of the PC as we know it. Perhaps I am not easily excited anymore. But thanks for the write-up and pics!!
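
As a quick check of that release-gap arithmetic, a two-line Python date subtraction (datetime gives 666 days exclusive of the end date; the 667 above presumably counts both endpoints):

from datetime import date
gap = date(2013, 9, 10) - date(2011, 11, 14)
print(gap.days)  # 666 days, i.e. roughly 22 months between the 3960X and 4960X launches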

As for Innomasta's pin question, the all-knowing, all-seeing Wikipedia says.. "DDR4 memory comes in 288-pin DIMM modules, similar to 240-pin DDR2/DDR3 DIMMs. The pins are spaced more closely (0.85 mm instead of 1.0) to fit more within the standard 5 1⁄4-inch (133.35 mm) DIMM width, the height is increased slightly (31.25 mm/1.23 in instead of 30.35 mm/1.2 in) to make signal routing easier, and the thickness is increased (to 1.2 mm from 1.0) to accommodate more signal layers."

Hey.That_Dude

MT/s = transfers per clock cycle × clock frequency (in MHz).
Although, to be honest, I think he got MHz and MT/s mixed up, as most DDR3 runs at 1866 MHz, which would mean 3732 MT/s... so yeah.
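
To put numbers on that formula, here's a small worked sketch assuming standard DDR behavior (two transfers per I/O clock) and a 64-bit (8-byte) memory channel:

# Worked example: clock and peak bandwidth implied by a DDR transfer rate
def ddr_numbers(transfer_rate_mt_s, bus_width_bytes=8, transfers_per_clock=2):
    clock_mhz = transfer_rate_mt_s / transfers_per_clock               # I/O clock in MHz
    bandwidth_gb_s = transfer_rate_mt_s * 1e6 * bus_width_bytes / 1e9  # peak GB/s per channel
    return clock_mhz, bandwidth_gb_s

print(ddr_numbers(2133))  # DDR4-2133 from the article: ~1066 MHz clock, ~17 GB/s per channel

So the "2133" in DDR4-2133 is already the transfer rate in MT/s; the underlying I/O clock is roughly half that.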

poee

I agree. It is disheartening to see the direction Intel and MS are going vis a vis the desktop PC. Gordon is clearly trying very hard to see the glass as half-full (even though the glass is barely wet at this point).

Liquid nitrogen overclocking records that have zero practical use? This is what remains of the "enthusiast" platform. Intel has made it plain what it thinks of PC "enthusiasts." What exactly is the reason it takes Intel 12+ months to get a Xeon rebadged as a desktop "-E" chip? I'd love a play-by-play that explained why it is so very difficult and time-consuming for a company like Intel to effect such a drastic metamorphosis. (Oh, it's the same socket? Same pin-out? Same thermals? What? What?!)

Innomasta

Gordon, do you think DDR4 will sport different pin sets like the last generations?