ASUS Adopting G-Sync Technology in Upcoming Monitors

29 Comments

Jademan

The good thing is that this technology is optional. They aren't forcing anyone to use it, they're just making it available!

Gman3968

I can't believe this crap. I've heard about both the Titan and the 780 doing 4K, not to mention the new R9 290X smoking both in comparisons. And now they have to add something to the monitor? Why? The monitors can't handle the output? I don't understand why I need all that crap to see a game better.

Jademan

I use a candle for a monitor! Yay, me!

Philippe Lemay

So, wait, that whole G-Sync thing won't work on old monitors? No offence meant to nvidia, but to hell with that. I'm not spending hundreds of dollars to replace my old screens.

V-sync may be "not as good" but it'll have to be good enough.

lordfirefox

I'm torn. On one hand I like ASUS monitors, but on the other hand I'm an AMD user because of Eyefinity. I know nVidia can do multi-monitor, but only up to 3 screens, whereas with Eyefinity I can do 5 or 24 at 120Hz.

Oh well. There's always Samsung.

E3Sniper

"New line of monitors"

The VG248QE is the monitor I'm using at the moment.

Maybe I'll be able to get an upgrade to include the chip? :)

Hey.That_Dude

Yeah, I'm not sure about this tech. When your GPU runs faster than your screen you'll get problems and dropped frames (not that big of a deal, but what will the chip do?), and vice versa: when the GPU is too slow you'll get serious stuttering and maybe even visual artifacts.

Then there's the problem of a closed ecosystem. Sure, it's nice now with Nvidia at the top of their game, but when they fall and AMD is the clear winner for 6 to 8 months, what then? Do I have to wait 6 to 8 months and postpone other parts of my build? And when I want to use this with the Intel GPU on my laptop, what then?

Then there's data transmission... DisplayPort has an AUX channel for extra info, but the other standards don't. Is this going to be a DP-only thing?

Seriously, this kind of bothers me because it's such a great idea. It would suck if we were forced to buy monitors based on the graphics cards we have or want.

Oh, and my captcha for this post was: uj3fn

Sirian

They said that when you exceed the monitor's refresh rate it falls back onto a pseudo double-buffered v-sync. Fortunately the monitor in question (my current display, minus the Nvidia tech) is capable of 144Hz and supports LightBoost, which I personally think is the more capable new tech... Honestly, at 120Hz I never even glimpse any screen tearing. Exceeding 144 fps is hardly an issue lol.

Hey.That_Dude

Just researched it. Sounds interesting. I just want an IPS that actually runs at 120Hz with this and I'll be fine (at 4K; no reason to upgrade anytime soon otherwise).

Rift2

I'd rather watch the refresh rate of Brittany Vincent =)

PWM needs to be addressed before anything else. On some monitors it's tolerable, but a lot of people can't tolerate it. Higher refresh is OK, but those 3D monitors with 350-400 brightness are crazy, totally unnatural, and dangerous to the nervous system.

Hey.That_Dude

Pulse Width Modulation? What?

The only thing dangerous about bright monitors is people who aren't smart enough to turn down the brightness in dark environments. That, or they could add soft lighting reflected off the wall behind their monitor to help ease eyestrain. Either way, that's a personal problem and comes with a different combination of answers for every situation.

Rift2

The added brightness bothers people who simply want a higher refresh rate, but the higher refresh is only offered in 3D monitors...

Turning down the brightness leads to PWM flicker, unless you have one of those snazzy Eizos that reduce PWM or that really expensive BenQ.

lordfirefox

Or, you know, you could download F.lux and let it turn down the color temperature of your monitor at night. It's less strain on your eyes. Unless you're working on color-sensitive stuff like Photoshop, but then you can just disable it for that. I've found F.lux to be pretty handy, even though sometimes I forget it's enabled while I'm playing a game and wonder why it's getting slightly darker than normal in Minecraft.

Hey.That_Dude

So what you're telling me is that the monitor makers can't make a clock multiplier for the PWM... That seems unlikely, but I've been surprised before.

vrmlbasic

Unless this has 8-bit color (instead of their current 6-bit solution), works for AMD off the bat, AND is at least 2560x1440, there's no point in this.

1080 is dead.

jbitzer

How do you figure this?

Everything is in 1080. Oh yeah, except HDTV, which is still 720 in broadcast TV.

Other than elitists measuring their e-peen, 1080 is still quite standard.

vrmlbasic

Have you ever seen the TV depictions of Cuba where the Cubans are shown driving only 1950s cars in the modern age, despite the fact that those cars have long since been technologically eclipsed and are only driven by the Cubans because they aren't allowed access to the modern cars?

I contend that PC gaming exists in the same state: we have monitor tech more advanced than 1080 (fact), but mainstream monitor companies refuse to make it available to us, so we suffer on with decrepit 1080, still forced to see the individual pixels in our TVs and monitors. 1080 died the second QHD monitors came into being, and that was years ago.

Bucket_Monster

In the television/movie world, though, you can't (currently) get media or broadcasts any greater than 1080p, so you could have an HDTV capable of insane resolutions, but it wouldn't be used.

For a general computing/gaming monitor, yes, I would hate to go below 1920x1200, but there's more to the world than just your desktop PC. Just because a tech exists doesn't mean it's being fully utilized.

Since you brought up cars, your viewpoint is sort of like saying gasoline-fueled cars are outdated and obsolete because electric/hydrogen-fueled cars exist. The gap between 1950s vehicles and modern-day vehicles isn't comparable to the gap between 1080p and higher resolutions today. Your 1950s car analogy would be closer to comparing ancient CRT monitors with modern-day LCDs.

vrmlbasic

No, my analogy is valid as written. The 1080 LCD is the 1950s car. The 1440 LCD represents a newer car based on the same principles as the old, just evolved and years closer to perfection, not some hackneyed and derivative electric car.

"Just because a tech exists doesn't mean it's being fully utilized." <-That is exactly my point. 1440 monitors exist. We just cannot conveniently purchase them. As an aside, 1080 is dead in the film world as well as it is motoring towards 4K.

More to the point of this article, it is a crying shame that it is easier to buy a 120 or 240 Hz TV than it is to buy a monitor with the same faster refresh rate. PC monitors have gotten the shaft.

jbitzer

Actually, your analogy is more like: 1080p is the regular car everyone has, and 1440 is like the Tesla Roadster that you have to pay more for, special-order, and that really doesn't offer anything all that spectacular outside of bragging rights for the price premium.

Higher resolution != better technology; it just means more pixels per inch. Whoop-de-doo, you stuck a big-block engine in a Mustang, you didn't build a flying car.

OpTicaL

My IPS Foris FS2333 will do just fine, thank you.

yammerpickle2

When will they add it to a 4K 3D display?

wumpus

They can add it all they want. If the GPU can't update frames nearly as fast as the display can refresh itself, G-Sync is pointless, and I haven't heard of a GPU that can do that at 4K.

Note that it may make a much bigger difference for the Oculus Rift, but Carmack was strangely silent about that.

Gameaholic1337

Hmm, the model number of the ASUS 144Hz monitor I just bought three months ago is also VG248QE. I'm really hoping this feature can be added to it with a driver or firmware update. Otherwise it didn't take long for my brand-new monitor to become obsolete.

legendaryone66

http://www.geforce.com/whats-new/articles/introducing-nvidia-g-sync-revolutionary-ultra-smooth-stutter-free-gaming

Look at that, you should be able to mod your screen.

Ch0plol

I'm almost 99% sure it won't be able to be added later. The G-Sync they showed at their conference was an actual PCB that gets installed in monitors. Unless that PCB is already in your monitor (which I highly doubt, as they just announced it yesterday), you won't be able to get its functionality.

vrmlbasic

Maybe the PCB device will be present in your monitor but will never be enabled, like AMD's new sound chip on its current HD 7000 series Radeon card.

I think not having the PCB widget in your monitor would be better. BTW, isn't that one of those funky monitors that only has 6-bit color?

kevaskous

Isn't yours one of those funky monitors with horrible ghosting that causes eye strain when playing fast-paced games or watching fast-paced media?

Dwarf

Technology... makes you feel cool one moment, then stabs you in the back and walks away laughing. Treacherous creature, that technology. I'd stay away.