Intel's Dedicated Graphics - I'm skeptical... (Jun 10, 2021)
When Intel announced about two years ago that they would be developing discrete graphics cards, I, and well, everyone else, took it with a grain of salt. I mean, maybe improved integrated graphics, or the company making something ridiculously expensive to accelerate AI, that sounds pretty believable, but a real GPU that I can hold, well, in my hands? As it happens, one is in my hands right now, and I fully believe that some really cool things are going to come out of this, but I'm still skeptical that Intel will ever turn this into a real product for gamers, even if I would love to be wrong. Pulseway brings you this CES video.
If you like to work while you're not at work, then you need Pulseway, the software that allows you to remotely monitor, manage and control all your IT systems from an app. Try it for free and get 20% off your Team plan when you sign up at the link below. This is not the first time Intel has created a graphics card, and the story of the DG1 begins many years before I was born, in 1963. The Apollo program was in full swing, and NASA decided that the best way to visit the Moon was to have one module land on the Moon while another remained in orbit. But docking those two modules would need to be mastered before the plan could move forward, since if the two modules failed to reconnect, everyone died. In order to properly train the astronauts on how to dock,
NASA created their Rendezvous Docking Simulator, and inside was a real-time image generator created by GE Aerospace, essentially the first graphics card. Over the next 30 years, GE Aerospace would improve their simulator, and at some point an engineer walked in and said, "Guys, have you tried this? This is a lot of fun," leading GE Aerospace to create Real3D and partner with Sega, yeah, that one, on the Model 2 and the Model 3 arcade boards to offer some mind-blowing graphics for the time. Then in 1993, GE Aerospace was sold to Martin Marietta, soon to merge into Lockheed Martin, which also got hooked on video games and decided to double down, not only expanding its partnership with Sega but also teaming up with Intel to create the most incredible graphics card the world had ever seen, codenamed Project Auburn. The hype over Auburn was real, and everyone assumed that the incredible resources and expertise at Intel's disposal would completely destroy 3dfx and Nvidia, until Auburn hit the market in February 1998. The i740 launched for just $34.50, and even at that price it sucked: the performance was abysmal compared to the 3dfx Voodoo 2, and by the end of the year everyone had basically forgotten about it.
This failure eventually led Lockheed Martin to abandon the project, and when Real3D closed, Intel bought the IP and tried again with the i752 and i754, but they never reached acceptable levels of performance and never really made it to market. Some of this IP is still alive in Intel's current iGPUs, which might explain why they sucked for so long, and Intel's dreams of creating a sweet graphics card for gamers were crushed. Many years passed: Halo brought first-person shooters to the masses, and Linus' voice remained exactly the same, until 2008, when Intel announced Project Larrabee. Once again the Intel graphics hype was in full force, with Intel claiming that its demo of an overclocked unit was the first public demonstration of a single-chip system surpassing a teraflop, but the stunt didn't impress Nvidia, who dismissed it as marketing, and analyst Peter Glaskowsky said it was like a GPU from 2006.
To get the full scoop on Larrabee, check out Linus' video where he gets hands-on with a real sample, but how that story ends is why I'm still a little skeptical about Intel's current DG1. The TLDW is that Larrabee was a complete success, not as a gaming graphics card, but as a general-purpose compute processor. The project lived on for 10 years as Xeon Phi, a platform that packed many CPU cores into a single chip for supercomputers, easily handling massive calculations, and it was actually used to build the world's fastest supercomputer in 2013. So this whole move of Intel's was like, "oh guys, we're going to make a big sick GPU," and that's why, when Intel announced in 2017 that they were going back to dedicated graphics, my first thought was: well, that's going to be great in 2020, for AI researchers.
This sentiment was reaffirmed when Intel announced that it would be partnering with the US Department of Energy to deliver the world's first exascale computer, and, shortly after, there were rumors that consumer GPU performance was poor. Which brings us to yesterday, when I saw Intel representatives bring out a working sample of the Intel DG1, and at first I was excited, until I saw it in action. Intel's sample was running Warframe, which you can see right behind me, and I don't need an FPS counter to tell me it's dropping below 60 fps. I couldn't help but feel like we've seen this song and dance before: a lot of hype, a prototype that's not quite there but looks promising, only for the consumer side of the project to be shelved down the road. The thing is, I still think this is the most important demo of CES 2020, because even if we never end up getting a graphics card that destroys Radeon or Nvidia, I think that would be missing the point: Intel's big advantage here will be integration with its CPUs and what that can do for mobile devices.
The way Turbo Boost can squeeze every last megahertz of stability and every degree of cooling out of thin-and-light machines is incredible. Imagine that kind of control over a mobile device's GPU, walking the tightrope of power and cooling available between the CPU and GPU simultaneously, to give us the kind of gaming experience you currently get on big 15-inch laptops on small 13-inch ones, or turning something like the Alienware UFO project into a real beast. The best part is that this is the worst-case scenario: we've already seen a massive increase in graphics performance in Ice Lake, and Intel claims that performance will double again in Tiger Lake.
As Intel ships the DG1 to software developers this year, hopefully we'll see its performance improve a lot, and maybe we'll get something to compete with Nvidia, but even if we don't, it'll be an awesome couple of years for gaming on thin-and-lights. See, with Pulseway you can't tell the difference between when I'm working and when I'm just writing fan fiction and movie reviews. Pulseway is the application that allows you to monitor and manage your systems remotely in real time, so from a single app you can scan, install and update your systems, send commands, deploy custom scripts, and it's compatible with Mac, Windows and, of course, Linux, which makes your life easier than you could have possibly imagined. So don't wait, check out Pulseway.
You can try it for free at the link in the video description, and at that same link you can save 20% on your Team plan. So thanks for watching, and be sure to subscribe to find out how well Intel is doing in the future, because I'm very excited to see what comes out of all this. See you.