
How Apple Just Changed the Entire Industry (M1 Chip)

May 31, 2021
Hello, welcome to today's episode of ColdFusion. Before we start: this episode is not a recommendation to buy any of Apple's products; in fact, I'm a Windows and Android user myself. The following is simply my documentation of what I perceive as the greatest moment in computing in the last decade. I think it's just fascinating, so make sure you're comfortable with this one, get out the popcorn, and get ready to be entertained or learn a thing or two.

A lot of people hate Apple these days, and there is some reason why: they have recently made their products nearly impossible to repair, increased the prices of their products and accessories to a ridiculous level, and enjoy excessive control. But what if Apple were to do something that really boosted the industry and stimulated competition?
Well, it turns out they may have. Phones and computers were never considered comparable when it comes to power; they lived in different worlds. Although, as long-time subscribers will know, back in 2012 I was comparing them to see whether a phone could replace a PC. The rate at which phone power improved year after year was absolutely incredible to me at the time; those were fun days. The main difference between phones and desktop computers is that they use different types of processor technology. Phones use a type of processor design called ARM, while desktops usually use another type called x86, like Intel's chips, for example. ARM has a simpler design than x86, making it more efficient, while x86 traditionally used to be able to do more. But this is changing.
The interesting thing to note is that in recent years our phones have improved at a much faster rate than our computers have. So here's a question: what happens when the chips in our phones come close to the power of laptops and desktops? What happens if you take that incredible performance and efficiency from a mobile chip and put it in the body of a laptop, with access to a larger battery and more room to breathe? Well, things start to happen that many thought were impossible.

"I didn't plug it in for four days, totaling a little over 10 hours of mixed use, and it still had 17 percent left." "The animations are as smooth as you would expect, and even opening and closing some of the larger apps like Chrome and Lightroom is as fast, if not faster, than on my desktop, which is sick. It's a really responsive, good-feeling experience overall." "Look at this, guys. Okay, Angelica, when you're editing this, show the footage of the Surface Pro. It's extremely smooth, and this is the base-model fanless MacBook Air. These M1 Macs are shaping the future of Mac hardware as we know it. It's changing everything."
The new MacBooks and the M1 chip that powers them deliver a power-efficient chip as powerful as some desktop CPUs. What most people are missing is that this is no ordinary MacBook: the M1 chip represents the culmination of a decade of planning and a sea change in the consumer computing industry. On the surface, putting a mobile chip in Apple's best-selling computer seems really stupid, but when you look deeper, it makes a lot of sense, as you'll see later. It's actually a classic case of disruption as defined by Clay Christensen in 1997. So how did Apple pull this off?
The topic of today's episode will be the journey of how Apple got to this point. You'll also see how the CEO of Intel refused to work on the iPhone, making one of the biggest mistakes in the history of technology, and what Apple did next. First we need to go over some quick history. Central processing units, or CPUs, are the brains of computers, doing the billions of calculations per second that make our devices work. CPUs run on what are called instructions. These are the little bits of fundamental data that tell our computers what to do. For example: add these two numbers, retrieve this number from memory, that kind of thing.
When these commands are grouped together, they are called an instruction set. In the coming battle between mobile and desktop CPUs, it would be variations in how these instruction sets were handled that made the difference. Since its founding in 1968 by Robert Noyce and Gordon Moore, Intel has been the gatekeeper of mainstream computer CPUs. Its chips would get the name x86 after a series of popular chips in the late '70s and early '80s whose model numbers ended in 86. The x86 instruction set architecture would dominate the industry from the beginning. Chip makers of the 1980s began adding increasingly complicated instructions so that an unsuspecting buyer would perceive their chips as better.
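To make the idea of an instruction set concrete, here is a toy sketch in Python: a pretend CPU with just four instructions that can add two numbers from memory, the exact example mentioned above. This is purely illustrative; real instruction sets like x86 and ARM have hundreds of instructions, binary encodings, and hardware registers, and the opcode names here are invented for the example.

```python
# A toy "CPU" with a four-instruction set. Illustrative only: real
# instruction sets are vastly larger and are executed in silicon,
# not interpreted like this.

def run(program, memory):
    """Execute a list of (opcode, operands) tuples against two registers."""
    regs = {"r0": 0, "r1": 0}
    for op, *args in program:
        if op == "LOAD":          # LOAD reg, addr: retrieve a number from memory
            reg, addr = args
            regs[reg] = memory[addr]
        elif op == "ADD":         # ADD dst, src: add these two numbers
            dst, src = args
            regs[dst] += regs[src]
        elif op == "STORE":       # STORE reg, addr: write the result back
            reg, addr = args
            memory[addr] = regs[reg]
        elif op == "HALT":
            break
    return memory

memory = [7, 35, 0]               # two inputs and one output slot
program = [
    ("LOAD", "r0", 0),
    ("LOAD", "r1", 1),
    ("ADD", "r0", "r1"),
    ("STORE", "r0", 2),
    ("HALT",),
]
print(run(program, memory))       # the output slot now holds 7 + 35 = 42
```

The set of opcodes the loop understands ("LOAD", "ADD", "STORE", "HALT") is, in miniature, the instruction set; the 1980s arms race described above amounted to making that list ever longer and more elaborate.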
Over time, these instructions bloated the chips' functionality, and the limited physical space on the chip was taken up by sophisticated instructions that were barely used. Meanwhile, in the United Kingdom in 1983, a company called Acorn Computers decided to go down a different path. Instead of making things increasingly complicated, why not take a simpler approach? This way of thinking was called RISC, Reduced Instruction Set Computing, in contrast to the bloated CISC (Complex Instruction Set Computing) designs of companies like Intel. "Some people say that RISC technology is the wave of the future; other people say that RISC is already an old-fashioned technology. What does this expert think?"
"Well, Stuart, IBM's current line of PCs is based on a processor called the 8086. It's a bit of a messy architecture, so with RISC we have a new starting point. We can say, okay, here is a simpler, faster instruction set with less code space." "So when you talk about RISC, is it a philosophy of the way you design the architecture of a machine, or is it a specific set of instructions?" "No, it's certainly a philosophy, or a style, and in fact it's a style that differs depending on what you're trying to do. The main idea is not to add any complexity to the machine unless it pays for itself based on how often you would use it."
From there came the Acorn RISC Machine project, also known as ARM. This was the birth of the type of chip that would be used in all of our smartphones for decades to come. In this episode, for the sake of simplicity, we'll refer to RISC chips as simplified CPUs and CISC chips as complex CPUs. As Acorn developed the first ARM chip, something unexpected happened during testing. Here, Professor Steve Furber talks about how the ARM project began: "When we set out on the ARM design, we really didn't expect to pull it off. First, we thought this idea of RISC was so obvious that the big industry players would notice and we would be trampled."
"We were expecting to get into this project and find out why it was not a good idea to do it, and the obstacle never emerged from the mist. We kept moving through the fog until we finally had a fully functional ARM chip in our hands. I don't think it was the first day, but a few days later we decided we had better measure the power consumption, so I powered up the ARM chip, ran some code, looked at the ammeter, and it was reading zero. I knew we had designed a fairly low-power chip, but this was a bit remarkable. It turned out that when inserting the ammeter into the power supply line, I had failed to connect the power supply, so no current flowed through the ammeter, yet the chip was still working." "So wait, you're saying that not only did you not connect the ammeter, you didn't connect the chip to the power supply at all?" "Yes, there was no power supply connection to the chip, and it was still working." In 1985 they had their first prototype, and in 1987 the first ARM-based computer, the Acorn Archimedes, was produced. From here, ARM would license its designs to other manufacturers to build.
At first, the low power consumption was a nice side effect, but it didn't offer much advantage for desktop machines. As portable computing devices began to appear, though, ARM became the first choice; its chips were used in everything from Nokia to Ericsson phones. So how were these ARM chips so efficient? When we talk about computer CPUs, we measure them in gigahertz, meaning billions of clock cycles per second. Simplified CPUs like ARM will generally execute one simple instruction per clock cycle, while desktop chips can take many cycles to complete a complex instruction. That means more power consumption, less efficiency, and more heat produced. Multiply this by hundreds of millions of times per second and the problem starts to become clear.
In some desktop CPUs, certain complex instructions can take a dozen clock cycles or more to complete. Contrary to popular belief, though, both types of processor flourished during the 1990s. In the early days, simple-instruction-set designs were great for engineering and graphics, so much so that computers using the technology helped create movies like Jurassic Park and Toy Story. Simple-instruction CPUs were also used in the 3D graphics chip of the Nintendo 64 and other early game consoles, but the desktop computer market safely belonged to Intel. "This symbol on the outside says that inside you'll find a legacy of technological leadership: the Intel Pentium processor, for the next generation of Intel-compatible computing. Intel: the computer inside." In 1991, Apple, IBM, and Motorola combined forces to create the PowerPC chip. Curiously, PowerPC was a desktop chip but used a simple instruction set, and Apple would proudly put it in its computers during the '90s. The simple-instruction ARM CPU would get a big boost in 2001, in the form of the iPod.
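The clock-cycle reasoning above can be turned into rough numbers. The sketch below assumes a fixed cost per instruction, which real CPUs do not have (they pipeline, reorder, and overlap work), so treat it as directional arithmetic rather than a model of any actual chip.

```python
# Back-of-envelope model of why cycles-per-instruction matters.
# Illustrative numbers only: real chips overlap instructions, so
# actual throughput is far more complicated than this.

def seconds_for(instructions, cycles_per_instruction, clock_hz):
    """Time to finish a workload if every instruction costs a fixed cycle count."""
    total_cycles = instructions * cycles_per_instruction
    return total_cycles / clock_hz

workload = 1_000_000_000                   # a billion instructions
ghz = 3.0e9                                # both hypothetical chips at 3 GHz

simple = seconds_for(workload, 1, ghz)     # RISC-style: ~1 cycle each
complex_ = seconds_for(workload, 12, ghz)  # "a dozen clock cycles" per instruction

print(f"simple CPU:  {simple:.2f} s")      # ~0.33 s
print(f"complex CPU: {complex_:.2f} s")    # ~4.00 s
```

At the same clock speed, the multi-cycle design burns roughly twelve times the cycles for the same work, and every wasted cycle is power drawn and heat produced, which is exactly the gap that made ARM the default for battery-powered devices.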
Apple's use of the ARM chip in the iPod was a microcosm of the popularity of ARM in portable devices, from cell phones to MP3 players; by 2002 there were a billion ARM chips in devices. Despite using simple CPUs in its portable devices, Apple's PowerPC collaboration with IBM and Motorola was not going so well. Its simple CPUs were falling behind Intel, and it looked like Intel's complex form of CPU thinking was the way to go for the desktop market, so Apple left PowerPC behind. "Yes, that's right: we are going to begin the transition from PowerPC to Intel processors." When Intel signed the deal with Apple in 2005, Apple was already thinking about what was coming next.
They went to Intel to see if Intel could build the chip for the upcoming iPhone. At this point, a moment occurred that Intel would never forget. Paul Otellini, Intel's CEO at the time, didn't see such a deal working because it was so far outside Intel's specialty. He didn't believe Intel could make enough money building mobile chips for Apple's new iPhone: the revenue wasn't going to cover the research costs, and besides, he couldn't imagine the iPhone selling well. This obviously turned out to be a huge missed opportunity. The iPhone launched in 2007 using a simple-instruction ARM CPU, and if Intel's CEO had made the deal happen, Intel would have had a massive stake in the iPhone business. Unfortunately for Intel, Apple was moving forward without them. In 2008, Apple signed an architecture license with ARM.
This allowed them to design their own chips from scratch. They also purchased P.A. Semi, a company whose founder had previously specialized in developing high-performance ARM chips. This was a key strategic acquisition for Apple, as it gave them the expertise and potential to design some of the best ARM chips and put them on the path to challenging Intel and its x86 processors. In 2012, Apple launched its first fully custom-designed CPU. Codenamed Swift, it was used in the iPhone 5, in contrast to the off-the-shelf ARM reference designs Apple had used previously. This new Apple-built chip was twice as fast as the previous one; even for a first-generation design, it had impressive performance compared to the mobile competition. But that was not the real surprise.
The real shock to the industry came the following year, in 2013, with the Apple A7 in the iPhone 5S. Apple's early adoption of a 64-bit architecture surprised everyone, beating rival CPU teams by more than a year; a competing proprietary 64-bit design wouldn't be seen until late 2014, in the form of the Galaxy S4. Apple famously called its 2013 chip design, quote, "desktop-class architecture". Most people thought it was just Apple bragging, but this wording was actually a hidden clue to where the company was headed, as we would soon see: its latest custom chips really do hold up on par with desktop PCs.
By 2014, there were 50 billion ARM chips in the world. Apple, meanwhile, was still selling a ton of iPhones and making a ton of money, and do you know what all that revenue can buy? A ton of R&D. The chips inside the iPhone continued to evolve and improve faster and faster while using less power. By the end of 2018, with Apple's A12 chip, it looked like Apple was catching up to Intel. In this graph, the gray dots are Apple and the blue squares are Intel. In the last five years, Intel managed to increase the performance of its flagship by about 28 percent, while in the same period Apple managed to improve its designs by 300 percent. And this is where things start to get very strange. The Intel chip at the top of this graph, the Core i9-10900K, is a desktop chip that uses 125 watts of power, the kind you have to plug into the wall, while Apple's A14 chip in the iPhone 12, the one in the top right corner, performs better than the chip I just mentioned at 5 watts, running on a phone battery.
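The efficiency gap implied by those figures is easy to quantify. The sketch below plugs in the numbers quoted above (125 W vs 5 W, 28 percent vs 300 percent over five years) and treats performance as roughly equal between the two chips; the per-year growth rates are back-calculated assumptions, not measured data.

```python
# Rough performance-per-watt comparison using the figures quoted above.
# Assumption: the two chips score about the same on a single-threaded
# benchmark (the episode claims the A14 actually scores slightly higher).

desktop_watts = 125   # Intel Core i9-10900K, plugged into the wall
phone_watts = 5       # Apple A14 in the iPhone 12

# If performance is roughly equal, efficiency scales inversely with power:
efficiency_ratio = desktop_watts / phone_watts
print(f"comparable work on ~{efficiency_ratio:.0f}x less power")  # ~25x

# The improvement-rate gap compounds, too: +28% vs +300% over five years.
intel_gain = 1.28 ** (1 / 5)   # implied yearly growth factor
apple_gain = 4.00 ** (1 / 5)   # a 300% improvement means 4x total
print(f"implied yearly gains: Intel ~{(intel_gain - 1) * 100:.0f}%, "
      f"Apple ~{(apple_gain - 1) * 100:.0f}%")
```

Compounding is what makes the trajectory argument bite: a roughly 5 percent yearly gain against a roughly 32 percent yearly gain means the gap widens every single generation.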
YouTuber Jonathan Morrison tested exporting a heavy video file on a desktop versus an iPhone 12: "Check this out. This is a pretty capable machine with 10 cores, it's got a beefy GPU, and what I've got here is Sony FX9 10-bit HDR footage. Let's get started. I'm going to open this and look at a couple of things. In the meantime, listen to the fans as this progresses. I have the iPhone 12 mini; note that the desktop is still chugging along slowly. I'm going to open iMovie and create a movie. The first thing you notice is the scrubbing in playback. This is a tiny, fanless iPhone 12 mini, and it's just butter. So what I'm going to do is export, making sure we're at 4K HDR, with about 20 percent battery."
"Note that the desktop export is still going. The phone has a nice little lead, and this is with about 20 percent battery and no fan, up against a decked-out iMac." What's happening here is that the iPhone chip has hardware acceleration, but you could argue that this is simply a smarter way of doing things. The M1 chip is even faster than this. Looking at the speed at which Apple's chip performance was accelerating, Apple was practically forced to ditch Intel. The x86 Intel platform was too slow to innovate; Intel was struggling to keep up with manufacturing technology, while Apple could outsource the manufacturing of its custom designs to others like Samsung.
Intel's desktop CPU performance lead was evaporating before everyone's eyes. In 2019, the iPad was more powerful than the previous year's Intel MacBook Pro, something I mentioned in a previous episode. The speed of these advancements would surprise people again in 2020. If you're enjoying this episode so far, you might be interested in bite-sized tech news, and today's sponsor, Morning Brew. Apple has announced that starting January 1st they will be reducing their App Store fees from 30 percent to 15 percent; this is in response to antitrust accusations from the United States and the EU. Apple also agreed to pay $113 million over the scandal of slowing down old iPhones to make their batteries last longer.
As for me, I learned about these stories through Morning Brew. It's a great way to catch up on the news in the morning: you can get up to speed in just five minutes instead of wasting time going to different sources. It's informative and makes the news less tedious and boring, it's delivered to your inbox every weekday and Saturday, and it's free, so there's no reason not to subscribe. If you're interested in business, finance, or technology, click the link in the description below to sign up for Morning Brew; it will only take you 15 seconds. So, where were we? ARM chips had long been used for low-power applications, such as mobile phones.
Intel refused to make chips for the iPhone, so Apple decided to do it themselves, acquiring a company to gain the expertise, and then pushed the state of the art until mobile phone chips were comparable to PCs. Now it was time for ARM and Apple to take on the PC. In November 2020, Apple revealed its new processor for the Mac. Called the M1, it is the company's first custom CPU designed for Macs. Note that it has 16 billion transistors, 35 percent more than the A14 inside the new iPhone 12, and because it is ARM-based and shares the same DNA as the iPhone, the new MacBooks can run iPad and iPhone apps. All of those transistors are built on a five-nanometer process. Early testing of the chip blew up the online tech review community: "...and it proceeded to annihilate the other tests, posting 1744 on single-core, which is faster than any Intel Mac ever made. Now Vadim really wants to jump in front of the camera and give his opinion." "Right, I have a base 2018 15-inch MacBook Pro."
"When I tried playing League of Legends on it, it's not smooth: it's glitchy, with frame drops. On the M1, I'm impressed that we're getting native resolution, everything maxed out, around 60 FPS." "And yours has a dedicated graphics card?" "Yes, it has a dedicated chip. And this is under Rosetta; it's not even optimized for Apple Silicon yet. These numbers are legit. This is super fast, to the point where I have a friend who makes a pretty well-known app and uses a really decked-out 16-core 3950X Hackintosh system specifically for development. This is his build time, and somehow these M1 devices keep up, one of them fanless. It's pretty crazy." "This was the moment I got the numbers back. This is it. This is real. Apple Silicon is real. If Apple as a company continues on its current performance trajectory, it will be crazy to see." Here's a quote from AnandTech: "Apple claims the M1 to be the fastest CPU in the world."
"According to our data, the previous A14 already outperformed all of Intel's designs and fell just short of AMD's newest Zen 3 chips. We can certainly believe that Apple and the M1 will be able to achieve that claim. AMD has shown a lot of progress lately, but it will be incredibly hard to match Apple's power efficiency. If Apple's performance trajectory continues at this pace, the x86 performance crown may never be regained." End quote. As for ARM's broader real-world applications, it was only a matter of time: the world's fastest supercomputer, Japan's Fugaku, is three times faster than the previous IBM record holder and uses an ARM architecture.
Raw power isn't the deciding factor for everyone, but greater efficiency and lower power consumption are something everyone can notice. Clay Christensen, who wrote the book The Innovator's Dilemma in 1997, states that disruptive innovations do not involve new technologies; rather, they consist of components built around proven technologies, combined into novel architectures that offer the consumer a set of attributes never before available. He continues: they are initially only used by unsophisticated consumers in low-end markets. So, ARM chips have been around since the 1980s, but they have traditionally only been used for low-end items like smartphones. Putting ARM chips in a laptop isn't new either; Windows machines have done this for years. But having an ARM chip so powerful that it can reach the top end of the consumer computer market is new.
Christensen drew a graph of disruptive technologies 25 years ago. Look at it carefully: what does it remind you of? Steve Jobs even had something to say about this. Speaking at the launch of his NeXT computer, he said that all systems and architectures have about a 10-year life. At the beginning, you have to get people to create applications for them; at around five years, the architecture reaches its peak, and then it continues on what could be called a glide slope. The architecture is all it will ever be at that point. So what about software compatibility? According to early reviews, the CPU is so powerful that it performs similarly or better when running applications that weren't even designed for it.
According to The Verge, it's so good that they couldn't even tell whether the apps they were running were coded for ARM or not. In conclusion: despite non-upgradeable RAM, considerable upgrade pricing, and no external GPU support, the big story here is the chip architecture and nothing else. What it shows is that ARM may have an inherent advantage in the laptop and desktop space. This moment took decades to develop, and I feel honored to have been around to witness it. Things are going to get super interesting from here on out, and I can only hope that competition arises from this.
That way, we as consumers can win. It is strange, though, how some people get upset about all this. Can't we just enjoy the progress being made instead of ridiculing each other over our technology preferences? I'm just happy that technology is moving forward and that competition can come from it. This performance increase is the biggest jump we've seen in many years, and I think it's incredible. And that's the end of the story, so thanks for watching. Are you excited about future laptops that have extremely long battery life and the CPU power of desktop computers?
These types of machines could be less expensive and work wonders for students and professionals alike. Anyway, that's about it. I do want to make a quick mention that I have some new merch just in time for the holiday season, so if you're feeling festive you can grab some via the description below. Anyway, my name is Dagogo and you've been watching ColdFusion, and I'll see you again soon for the next episode. Cheers, guys, have a good one. ColdFusion: it's new thinking.
