I agree, it's not happening now, but it will happen soon. Ray tracing scales linearly with the number of cores, and core counts are increasing every year now, so it's the obvious future choice. With rasterisation (is that even a word? lol) they're spending more and more time trying to fake ray-tracing effects, to the point that it will eventually exceed the processing time of an optimised ray-traced scene. You also get hit detection and collision practically for free with ray tracing, which is going to count more and more against rasterised rendering. I love the idea of seeing enemies in mirrors.

Just a point on increased resolution: there are plenty of techniques for increasing ray-traced resolution that don't require 1 ray per pixel (simple interleaving can increase resolution 3-fold).

You're right, but with all the hoops devs jump through to make current games look good, at least ray tracing is nigh on completely linear, so increasing cores increases performance. I'd much prefer a world like that, where you can add a new card and precisely double performance.

I've been ray tracing for 18 years now, from the early days of Imagine 1.0 on the Amiga, and I'm looking forward to the future.
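To illustrate the "completely linear" point, here's a rough sketch of a multi-core render loop. The Colour type, trace_pixel and the whole layout are made-up placeholders, not code from any real engine. Every pixel's primary ray is independent of every other, so each core can take its own band of rows with no locking, which is why doubling the cores roughly halves the frame time.

#include <algorithm>
#include <cstdint>
#include <thread>
#include <vector>

struct Colour { std::uint8_t r, g, b; };

// Stand-in for the real work: a proper tracer would fire a primary ray
// through pixel (x, y) and shade whatever it hits.
Colour trace_pixel(int x, int y)
{
    return { static_cast<std::uint8_t>(x % 256), static_cast<std::uint8_t>(y % 256), 0 };
}

// Split the rows across the available cores; no shared state, no locks.
void render(std::vector<Colour>& framebuffer, int width, int height, int num_cores)
{
    std::vector<std::thread> workers;
    const int rows_per_core = (height + num_cores - 1) / num_cores;
    for (int c = 0; c < num_cores; ++c) {
        const int y0 = c * rows_per_core;
        const int y1 = std::min(height, y0 + rows_per_core);
        workers.emplace_back([&framebuffer, width, y0, y1] {
            for (int y = y0; y < y1; ++y)
                for (int x = 0; x < width; ++x)
                    framebuffer[y * width + x] = trace_pixel(x, y);
        });
    }
    for (auto& w : workers) w.join();
}

int main()
{
    const int width = 1280, height = 720;
    std::vector<Colour> framebuffer(width * height);
    render(framebuffer, width, height,
           std::max(1u, std::thread::hardware_concurrency()));
}

Real scenes obviously aren't this simple, but the structure (one independent ray per pixel) is why the "add a core, add performance" argument holds up so well on paper.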
Yeah, don't get me wrong, ray tracing is awesome. It definitely has its benefits -- especially in the shadow department. It's like 10 lines of code to calculate real-time shadows for a whole scene, as opposed to having to code them per object in a raster environment.
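For what it's worth, here's roughly what those "10 lines of shadows" look like: a minimal sketch, not taken from any shipping engine, assuming a scene made only of spheres. Once you have a hit point you just fire one extra ray at the light and see if anything gets in the way first.

#include <cmath>
#include <vector>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(double s)      const { return {x * s, y * s, z * s}; }
    double dot(const Vec3& o)     const { return x * o.x + y * o.y + z * o.z; }
    double length()               const { return std::sqrt(dot(*this)); }
    Vec3 normalised()             const { double l = length(); return {x / l, y / l, z / l}; }
};

struct Sphere { Vec3 centre; double radius; };

// Classic ray/sphere test (direction assumed normalised): returns the distance
// along the ray to the first hit, or a negative number if the ray misses.
double hit_distance(const Vec3& origin, const Vec3& dir, const Sphere& s)
{
    Vec3 oc = origin - s.centre;
    double b = 2.0 * oc.dot(dir);
    double c = oc.dot(oc) - s.radius * s.radius;
    double disc = b * b - 4.0 * c;
    if (disc < 0.0) return -1.0;
    return (-b - std::sqrt(disc)) / 2.0;
}

// The whole "shadow pass": one ray from the surface point toward the light.
// If any object sits between them, the point is in shadow.
bool in_shadow(const Vec3& point, const Vec3& light_pos, const std::vector<Sphere>& scene)
{
    Vec3 to_light = light_pos - point;
    double light_dist = to_light.length();
    Vec3 dir = to_light.normalised();
    Vec3 origin = point + dir * 1e-4;            // nudge off the surface to avoid self-hits
    for (const Sphere& s : scene) {
        double t = hit_distance(origin, dir, s);
        if (t > 0.0 && t < light_dist) return true;
    }
    return false;
}

The in_shadow function really is about ten lines; compare that with a raster engine, where you'd be setting up shadow maps or stencil volumes per light.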
That's all well and good when you're rendering a ray-traced scene or objects in a scripted, pre-rendered sequence - not so good for a game though, where you will need collision detection, physics, non-pre-rendered/scripted movement etc.
Well, the ray tracers are just engines. Take POV-Ray for example: you set up the scene and it generates the picture; it's no different from placing a few characters and buildings in a scene using the Unreal 3 engine. With ray tracing there's no subsequent calculation required for collision detection, because all the paths of the rays are followed, so you get collision detection for free, unlike how it's currently implemented with hit boxes etc. In fact, moving characters is much easier to do with ray tracing, and more realistic, because you don't have clipping problems.

Fair enough, physics needs to be calculated separately along with the rest of the stuff, but that only takes one core.

I'm not saying it's gonna happen tomorrow, but it will eventually, simply because it scales well across cores and would be far more efficient than SLI and CrossFire. Intel are investing a fortune in it, so I'm guessing there's something to it. I think it could really add a lot to gameplay.
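A rough illustration of the "collision for free" idea (the HitRecord layout here is invented for the sketch, it's not from POV-Ray or any real engine): the renderer already records, per pixel, which object the primary ray struck and how far away, so a question like "what is the player shooting at?" is just a lookup into data the frame has already produced.

#include <optional>
#include <vector>

struct HitRecord {
    int    object_id = -1;   // -1 means the primary ray flew off into the sky
    double distance  = 0.0;  // distance along the primary ray to the surface
};

// hit_buffer holds one HitRecord per pixel, filled as a side effect of rendering.
// "Did the shot land?" becomes a lookup at the crosshair pixel: no hitboxes needed.
std::optional<HitRecord> object_under_crosshair(
    const std::vector<HitRecord>& hit_buffer, int width, int height)
{
    const HitRecord& centre = hit_buffer[(height / 2) * width + (width / 2)];
    if (centre.object_id < 0) return std::nullopt;  // nothing under the crosshair
    return centre;
}

Whether a real game would reuse the exact render rays or fire a couple of dedicated ones is an open question, but either way it's the same intersection code doing the work, not a separate hitbox system.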
I understand that. But from a resolution-scaling point of view, and to get the free collision detection, you have to scale the number of rays as the resolution scales, hence my earlier point. While interleaving can increase the resolution, is it not the case that you would lose the ray data you'd use for the collisions? It's certainly exciting to see the possibilities. I just wish we had the hardware to make it plausible.
It still amazes me that people on these forums still complain about Vista 'bloatware'. With DDR2 memory prices hitting rock bottom at the moment, and good dual/quad-core CPUs that overclock extremely well on the market for peanuts, there shouldn't be anyone here who cannot get Vista to run smoothly. When XP came out, forums were filled with people complaining about it needing 1GB to run smoothly, and back then memory prices were a joke. It's part of the evolution of PCs: if software didn't demand more resources and games didn't demand more GPU power, the market would stagnate.

The big disappointment IMO is the pre-built 'gaming' PCs you see in retail shops. They are a joke. But people aren't stupid in general. Doing a little research before loading the credit card with these oversized paperweights would stop the production of these kinds of machines. It's only through the ignorance of the consumer that they still get sold.
I just read that article again; he was getting 100 fps with the QuakeRT engine at 1280x720, which is damn impressive. Definitely, if you increase the resolution you have to increase the number of rays, but at least if you double the resolution you know your framerate will halve. At the moment, if you double the resolution, god knows what the performance impact may be.

However, as I said, you could increase the resolution with interleaving, without the huge hit of increasing the number of rays, though at best you could only do 1 ray per 3 pixels, otherwise it would look crap. You wouldn't lose the ray data; all it means is that what you see on screen no longer matches 1:1 with the rays being calculated. It's less accurate (simply because it's no longer 1 ray per pixel), but only slightly, and it only affects the view and anything colliding with it.

I find the whole thing fascinating and I'm really looking forward to where this is going, because stuff like rendering hair (calculating hair collisions), fur and cloth with physics works extremely well with ray tracing.

The hardware would be great to have now, alright, but it'll be here soon. I rendered a scene on my Amiga in 1992; it took 26 hours per frame and two months to render the 30 frames (I didn't have it rendering all the time). I can render the same scene now in a few minutes lol.
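Here's a tiny sketch of the interleaving trade-off being described, again with placeholder names rather than anything from the QuakeRT work: trace one ray per three pixels along a row and share its colour with the neighbours, so the ray count drops to a third while the output resolution stays the same.

#include <cstdint>
#include <vector>

struct Colour { std::uint8_t r, g, b; };

// Stand-in for the real tracer: a proper version would fire a primary ray here.
Colour trace_pixel(int x, int y)
{
    return { static_cast<std::uint8_t>(x % 256), static_cast<std::uint8_t>(y % 256), 0 };
}

void render_interleaved(std::vector<Colour>& framebuffer, int width, int height)
{
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; x += 3) {
            Colour c = trace_pixel(x, y);               // one real ray...
            for (int dx = 0; dx < 3 && x + dx < width; ++dx)
                framebuffer[y * width + (x + dx)] = c;  // ...shared by up to three pixels
        }
    }
}

A smarter version would interpolate or rotate the pattern each frame, but even this crude form shows why the image is only "slightly" less accurate: the geometry of the rays hasn't changed, just how many pixels each one has to cover.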
Lol, so a scene you made 16 years ago still cannot be rendered in real time on a VASTLY superior machine? Sure, the rendering time has dropped significantly, but nowhere near the level needed for games. It's all very interesting, and I can't wait to see where it all goes, but I fear I won't ever get to play a real-time, properly ray-traced game in my lifetime.

The 100 fps he had was with the older QuakeRT engine (based on Q3). In fairness, it looks a LOT worse than what we currently have using rasterising (low-poly models, low-res textures). Some of the shader effects look nice done through RT in the id Tech 3 engine, but nothing jaw-dropping.
I don't know much about QuakeRT or Alex St John. Some say I don't know much, period. He made some valid points, and he obviously knows quite a bit. I did like the sound of "ray-traced worlds with highly realistic physics". However, saying "Vista blows" is quite an aggravation to me. I'd like to see what fully featured, mainstream-compatible operating system he uses... Let's face it: fossil fuel might not be fantastic, but it's still the best solution the human race has at the moment. ...And fair play to those that mentioned WildTangent.
Yep, but we're limited by Moore's law. However, you could have real-time ray tracing now: just hook up a load of quad-core PCs in a network and have one synchronising them (expensive, though). Also, the pic had 16 or 18 levels of recursion for the light reflecting, which would be far too much for games.

I doubt we're really limited by Moore's law, honestly, because we've been limited by it before (we were using single cores for ages, mainstream dual cores have only come out in the past few years, and software wasn't very mature, and still isn't, at using them fully). However, because the ray-tracing problem is completely linear, you can hook up a load of processors or cores, halving the time taken every time you double the number of cores/processors. Intel is planning 80+ core chips in 2015 if I remember correctly. Sun also have a funky processor design (haven't got a clue what the name was) where they can be joined together like Lego (and also stacked), so you can add more and more cores as you need them.

Yeah, I didn't get a chance to read the article fully at work; it's still very rough, but the spider was cool.
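Just to put some numbers on the "double the cores, halve the time" argument: the per-core throughput figure below is a pure assumption for illustration, not a benchmark of any real CPU.

#include <cstdio>

// If the workload is rays-per-frame and it splits perfectly across cores,
// frame time falls in direct proportion to the core count.
double frame_time_ms(long long rays_per_frame, double rays_per_second_per_core, int cores)
{
    return 1000.0 * static_cast<double>(rays_per_frame) / (rays_per_second_per_core * cores);
}

int main()
{
    const long long rays = 1280LL * 720LL;  // 1 primary ray per pixel at 1280x720
    const double per_core = 5e6;            // assumed rays/second per core, purely illustrative
    for (int cores = 1; cores <= 16; cores *= 2)
        std::printf("%2d cores -> %.1f ms per frame\n",
                    cores, frame_time_ms(rays, per_core, cores));
}

The same arithmetic applies whether the cores live in one box or across a network of PCs; synchronisation overhead is the main thing that keeps it from being perfectly linear in practice.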
An Nvidia GPU + Ageia PPU on a single card = a lot more headroom to work and play with, if you know what I mean. Add the Nehalem direct GPU-to-CPU access on top of that and let the real-world interaction begin (finally hair and full cloth movement, not to mention fully destructible environments, etc.)... with DX11, that is lol ^^
^^ lol, I didn't know that, but I played on it back then, so history is kind of repeating itself, eh? lol
And what about us DDR1 people? Our memory still costs over twice as much for the cheap kind and at least 4x more for the good stuff. You can't just abandon such a big part of the user base.
Lol, good man, I love the old days of the Amiga. Every few months they were getting more colours or bigger sprites or more parallax; it was fantastic. Now it's just $hit.
When Vista appeared I definitely agreed with this guy... but now, after some time of optimisation and SP1, I'd say I don't agree with him. Vista runs great. Yeah, you lose 3, 4, 5 fps in DX9 games, but that's going to be fixed, and the loss isn't big enough to say "OMG I can't play games on Vista!"
Vista still sucks donkey balls IMO. The constant HDD activity is REALLY annoying with my WD Raptor 1500s, and the activity never stops. I waited a few hours... nothing, still humongous HDD activity.
I have to say I'd blame your Raptors, or if you have a RAID array, the controller. I have never experienced constant HDD activity. Are you sure you don't have restore points scheduled constantly, or some other program scheduled to run constantly (AV, for instance)?
^ Are you using 32-bit or 64-bit? Because until I fixed it on my Vista 64 Ultimate it drove me nuts. If you want to get rid of it, stop SuperFetch, ReadyBoost and shadow copying, and switch off all the scheduled stuff; it's in Admin Tools somewhere (sorry, I don't boot Vista anymore so can't tell you exactly). Anyway, there's a load of crap there that's scheduled to run very regularly. Do all that and it should solve your problems.