A13 Bionic: this is the 8.5-billion-transistor brain that powers the new iPhones

by Kelvin

Each new generation of iPhone raises the same question: is it worth upgrading? Answering it would require analyzing each particular case, because it is not a simple yes or no. One of the questions I am asked most often on social networks (or one of the arguments used to justify upgrading or not) is that the CPU is "only" 20% faster than the A12, and that this is not much. And we return to the same dichotomy: for some that increase will justify the change, and for others it will not.

The problem is that measuring a CPU by its speed increase is like measuring a car by its top speed. If we do that, we forget that a car is not only something that goes fast: a car has fuel consumption, and perhaps a more efficient engine that "only" gains 10 km/h of top speed turns out to consume 20% less fuel. That new model will have more comfortable seats, a better multimedia system with Apple CarPlay, a better passive-safety rating in crash tests, an LTE internet connection... There are many more factors to consider, and the most important is its ability to perform specific tasks.

  

A CPU is much more than its speed, and judging it by that criterion alone means focusing on just one of the many improvements that should count in our assessment.

Algorithms

Before talking directly about the A13, let's put some order into the comparisons being made these days, mainly to lay out enough knowledge to let us judge the change fairly: not only to know the A13, but to understand its differences from previous generations.

The star question is: "If the new A13 is only 20% faster, why can't an XS with the A12 take photos in night mode like the iPhone 11?". Simple: the algorithms and the functions of the CPU. It is not a whim of Apple's: the A12 simply does not have the capability to take night photos in the way Apple has implemented this function. Its algorithm would not work on an A12 for reasons of capability, not speed.

An algorithm is the set of instructions we give a computer to do something.

In the case of a night photography mode (to stick with the specific example) there are different ways of doing it, that is, different algorithms: calculations performed on each and every pixel of the photo, together with the information the lens gives us, in order to obtain a photograph in this mode.

We will explain in a simple way the differences between algorithms and how we can have many solutions that achieve the same goal (with different levels of efficiency). Imagine that I want to add up (in Swift) all the numbers in a sequence. I could do it like this:

Swift code to sum an array
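The original article showed this first example as a screenshot, which is not reproduced here. A plausible reconstruction, assuming a hand-written recursive version (the names are my own):

// Hypothetical reconstruction: summing an array by hand, recursively.
func sumRecursively(_ numbers: [Int]) -> Int {
    // Base case: an empty array sums to zero.
    guard let first = numbers.first else { return 0 }
    // Add the first element to the sum of the remaining elements.
    return first + sumRecursively(Array(numbers.dropFirst()))
}

let numbers = [1, 2, 3, 4, 5]
print(sumRecursively(numbers)) // 15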

But you could also use a loop:

Swift code to sum an array using a for loop
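Again, the original showed a screenshot; a sketch of what the loop version presumably looked like:

// Assumed reconstruction of the loop version.
let values = [1, 2, 3, 4, 5]
var total = 0
for value in values {
    total += value // accumulate each element in turn
}
print(total) // 15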

The latter is a more efficient algorithm. But it is even more so if I use functional programming and do:

let sum = array.reduce(0, +)

All these pieces of code do exactly the same thing: they get the same result. But some are more optimal than others; they iterate more or fewer times and need more or fewer instructions.

Neural Cam vs iPhone 11 (A13)

With this seen and understood, we take the next step: night-mode photography with the new iPhone versus the well-known Neural Cam app. Some will think: "If the Neural Cam app does it from an iPhone 6 onwards, why does Apple only allow it with the A13?". The problem is assuming that the same end result (and its quality) implies the same process behind it.

The same (or a similar) final result does not imply that the algorithm or method used is the same. There are infinite paths that lead to Rome.

Let's quickly analyze how Neural Cam and the new iPhone each take their night photographs, to see the difference.

Neural Cam

The Neural Cam app asks us to point at what we want to photograph and tries to focus (even if there is little light). It locks the focus and asks us to take the shot. It then captures with the shutter open for two seconds, during which we must hold the phone still; here, the optical stabilization of our phone will be key. When it finishes, in a process that takes about 10 seconds (longer if our phone is older), it takes all the information captured in those two seconds and reduces the size of the photo. Why? To eliminate any movement in it.

Neural Cam, although it uses machine learning, does not use Apple's dedicated chip components. It uses the CPU, which is why it takes 10 seconds or more to process each photo.

It is a great algorithm that relies, for everything, on the brute force of the CPU without using any specialized component. If we download a €3.49 app to take night photos that our device cannot take natively, and we have to wait 10 seconds or more for each one, we accept it because it is NOT a native function: it is a third-party app and we are more permissive. If Apple shipped this on its current devices, the torches would be at the gates of Apple Park.

And this is important to understand: it uses machine learning, but it does NOT use the Neural Engine of the latest generations of Apple processors, which only works through Apple's development kits. To run its model it uses the raw force of the CPU, and that is why it takes 10 seconds or more, and also why it works on any device with iOS 12. Any algorithm, whatever it is, can always be run on the CPU: it will just consume more energy and be slower.
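As an illustration of that last point, Core ML lets a developer decide whether a model may use the Neural Engine and GPU or must stay on the CPU. A minimal sketch, assuming a hypothetical compiled model called "NightModel" bundled with the app:

import CoreML

// Force execution on the CPU only: the slow path any iOS 12 device can take.
let cpuOnly = MLModelConfiguration()
cpuOnly.computeUnits = .cpuOnly

// Let the system use the Neural Engine and GPU when available: the fast path.
let useAllUnits = MLModelConfiguration()
useAllUnits.computeUnits = .all

if let modelURL = Bundle.main.url(forResource: "NightModel", withExtension: "mlmodelc") {
    do {
        let slowModel = try MLModel(contentsOf: modelURL, configuration: cpuOnly)
        let fastModel = try MLModel(contentsOf: modelURL, configuration: useAllUnits)
        _ = (slowModel, fastModel)
    } catch {
        print("Could not load the model: \(error)")
    }
}

The same trained model runs in both cases; only the hardware it is allowed to use changes, and with it the speed and energy cost.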

iPhone 11 Pro

Night mode in iOS, however, is native and automatic, and it takes just a second to get the result. When we are about to take the photo, the way of capturing it changes depending on what the device is seeing in the preview.

There are many ways to take a picture in night mode: Neural Cam takes 10 seconds, the Pixel 3 takes 6 seconds... there are better and worse approaches, but Apple has decided that its own will use a component that the A13 chip has and the rest do not, in order to get it done in less time and improve the experience. If you don't like it, you will always have Neural Cam, or you can buy a Pixel.

In this generation of iPhone, the devices are able to analyze and obtain results from the feed we see before taking the photograph: a feed which is, after all, a real-time video of what the camera is seeing. Depending on what it sees and how it sees it, whether the scene moves a lot or a little, it changes the way it takes the picture. If we point the camera at a fixed object and hold the framing still, the system takes several photographs at once with longer exposures. But if it detects that we are moving, or that the objects in front of the camera are moving faster, it takes more, shorter shots with different exposures, so it has more results to merge.
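Apple has not published this logic, but the branching described above might look, in very simplified Swift, something like the sketch below. Every type, name and threshold here is invented purely for illustration; this is not Apple's real pipeline.

// Purely illustrative: invented types and thresholds.
struct ExposurePlan {
    let frameCount: Int
    let exposureSeconds: Double
}

func planNightCapture(estimatedMotion: Double) -> ExposurePlan {
    if estimatedMotion < 0.1 {
        // Scene and phone are essentially still: fewer, longer exposures.
        return ExposurePlan(frameCount: 4, exposureSeconds: 1.0)
    } else {
        // Motion detected: many short exposures at varying brightness, fused later.
        return ExposurePlan(frameCount: 9, exposureSeconds: 0.1)
    }
}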

Night Mode iPhone 11

Therefore, the iOS 13 night mode on the new iPhones uses machine learning to detect the semantics of the photograph: to understand what is being seen in it, its shadows, what moves and what does not, how we hold the camera, whether the framing is fixed or not, at what distance, whether the subject is still or moving... This is called image semantics, a capability that only the A13 chip has in its Neural Engine: the ability to recognize the contents and analyze them, even in preview mode, something an earlier device could not do.

Before talking about the A13 itself, it seemed essential to understand this difference. Could Apple have built that night mode without the features added in the A13, and made it backward compatible? If it had, the mode would be slower and, for example, it would not be able to recognize the semantics of the camera preview at 60 images per second, which would force Apple to look for another, perhaps less efficient, solution like Neural Cam's. That would mean a poor user experience, something Apple avoids, as we all know.

A13

The A13 is a huge leap forward in architecture. It is not that it is 20% faster, which is the least of it: what matters is the quantity of components and small new parts that help the whole, as we have discussed (provided we use native development).

Apple's chip team has been working for about 10 years: in April 2008 Apple bought the company PA Semi, and in January 2010 the first device with Apple's own chip was presented: the iPad. This is the tenth chip they have released, if we do not count the X variants that some generations have had, which only add more CPU or GPU cores.

Apple's first chip, the A4, inside the first iPad

The A13 Bionic, in which each and every component has been improved and new ones added, is not just a CPU. Now that we have had time to analyze it, we can identify the following components:

  • High-efficiency sound processor with 24-bit audio support. It includes a DAC with advanced AAC and HE-AAC support (versions 1 and 2) and can also play MP3, FLAC and many other audio formats. This processor can also calculate the wave alterations for the new iPhones' spatial sound, which creates computational surround sound in the way the HomePod does, together with the ability to decode Dolby Atmos files, also supported at the sound level. And one more function: it can amplify the sound of a specific element of the sound spectrum. If we zoom in on something in a video, whatever we frame that is generating sound can have its volume increased (a zoom on the sound).

Surround sound on iPhone 11

  • Screen management processor and its refresh frequency. We must not forget that the screen of the new iPhone Pro shares its technology with the Pro Display XDR monitor. It has an extended HDR mode, an average brightness of 800 nits with peaks of 1,200, and a contrast ratio of 2,000,000:1. This processor establishes a dynamic refresh of the screen so that when some pixels do not change their content they are refreshed less often, which improves battery efficiency considerably. The Super Retina XDR screen of the iPhone Pro does something similar: it can avoid sending a refresh signal to the screen if its content has not changed and thus save battery, with up to 15% higher energy efficiency than the XS or XS Max. It works in conjunction with an OLED display processing unit to manage the pixels and how they light up, using the Pro screen engine to refresh or paint them.
  • High-performance unified memory processor. This is a technology that Nvidia uses in its CUDA library and in its GPUs. It basically allows any component to access any part of the overall memory system and, at the same time, migrate data on demand into the dedicated memory of each component with high-bandwidth access. In this way, in a single operation, the component that accesses some content in general memory gets that data migrated directly into its own memory area or registers (see the sketch below).
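From the developer's side, the closest visible analogue of this idea is Metal's shared storage mode on iOS, where the CPU and GPU address the same buffer without copies. A minimal sketch of that concept (this is the public Metal API, not the A13's internal mechanism):

import Metal

// Shared storage: the CPU writes directly into memory the GPU will read, with no copy step.
if let device = MTLCreateSystemDefaultDevice(),
   let buffer = device.makeBuffer(length: 1024, options: .storageModeShared) {
    let pointer = buffer.contents().bindMemory(to: Float.self, capacity: 256)
    pointer[0] = 42.0 // immediately visible to any GPU work that uses this buffer
}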

iPhone 11 HDR

  • HDR image processor, which is responsible for receiving the image and directly computing the improved Smart HDR processing. This way, no CPU process is involved and the treatment is instantaneous. That is why previous models cannot do this improved HDR: it is not a software algorithm we could port to another device, but a chip specifically programmed for it.
  • Computational photography processor, which helps interpret the content of the image and coordinates the processing and fusion operations, delegating them to another component responsible for receiving photos with instructions and computing the fusion.
  • Always-on processor, for the components that must keep working in low-power mode: managing a Bluetooth connection while the device is locked, the GPS signal that comes in constantly, or audio playing in the background. It avoids using the full CPU for operations that should consume far less while running in the background.
  • Neural Engine, which runs trained machine learning models. All AI processes depend on it for running neural and convolutional networks, and all kinds of networks that process data and produce probabilistic results.

Machine Learning Controller

  • Secure Enclave, an essential part of iPhone security. It stores the biometric data and encrypts and decrypts data transparently in a black box that never exposes the encryption keys: it only returns the encrypted or decrypted data, as is. It also allows the signature of any process to be validated. The A13 adds support for the Secure Enclave in the form of a cryptographic calculation accelerator for faster encryption, decryption and related computations (a sketch of how apps use the enclave follows this list).
  • Generic high-performance CPU cores, two of them, faster and with greater capacity and performance, dedicated to more complex operations that require more load.
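As a sketch of that "keys never leave the enclave" idea, this is roughly how an app asks the Secure Enclave to create a private key it can use but never read. The application tag is an invented example; in practice an access-control policy would normally be attached as well.

import Foundation
import Security

// Ask the Secure Enclave to generate a P-256 private key that never leaves the chip.
let attributes: [String: Any] = [
    kSecAttrKeyType as String: kSecAttrKeyTypeECSECPrimeRandom,
    kSecAttrKeySizeInBits as String: 256,
    kSecAttrTokenID as String: kSecAttrTokenIDSecureEnclave, // keep it inside the enclave
    kSecPrivateKeyAttrs as String: [
        kSecAttrIsPermanent as String: true,
        kSecAttrApplicationTag as String: "com.example.a13demo.key".data(using: .utf8)!
    ]
]

var error: Unmanaged<CFError>?
if let privateKey = SecKeyCreateRandomKey(attributes as CFDictionary, &error) {
    // The app can sign or decrypt with this reference, but can never export the key material.
    _ = privateKey
}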

Energy Efficiency Cores

  • Energy-efficiency cores, four cores that perform more basic and simple operations that do not require as much power. Being slower and having fewer components, running a basic operation on them saves battery compared with running that same undemanding operation on a high-performance core.
  • Performance controller, based on a machine learning component outside the Neural Engine that analyzes each process and its demand, classifying which instructions or processes require more or less CPU load. If that learning engine understands that a process needs less power, it sends it to an energy-efficiency core; if not, to the high-performance ones. No previous Apple processor had put an AI engine directly in charge of this task: they had leaned on it partially, but now it is 100% AI (see the sketch after this list).
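Developers feed that scheduler hints through quality-of-service classes; the system then tends to keep low-priority work on the efficiency cores and reserves the performance cores for urgent work. A minimal sketch (the mapping to specific cores is always the system's decision, not the app's):

import Foundation

// Background housekeeping: the system may schedule this on the efficiency cores.
DispatchQueue.global(qos: .utility).async {
    // e.g. cleaning caches or prefetching data
}

// Work the user is waiting on right now: eligible for the high-performance cores.
DispatchQueue.global(qos: .userInteractive).async {
    // e.g. preparing the next frame of the UI
}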

A13 Capabilities

  • Video encoding and decoding processors for H.264 and H.265, responsible for HD and 4K video at up to 60 fps. The video block also includes a processor responsible for processing and returning the HDR information in the video.
  • Depth engine, which calculates the different depth layers of photographs or images.
  • Storage processor, responsible for managing storage directly, both internal and external through the Lightning port with a USB adapter.
  • Matrix multiplication accelerator, a dedicated block we talked about not long ago, which speeds up the multiplication of matrices of several dimensions by up to 6 times (see the sketch below).
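Apps do not target that accelerator directly; they go through Apple's libraries (simd, Accelerate, Core ML) and the system decides where the math actually runs. A minimal simd sketch of the kind of operation involved:

import simd

// Two 4x4 matrices multiplied with Apple's simd library.
let a = matrix_identity_float4x4
let b = simd_float4x4(diagonal: SIMD4<Float>(1, 2, 3, 4))
let c = simd_mul(a, b) // matrix product, dispatched to the fastest available path
print(c.columns.0)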

Finally, one of the most important: the energy manager or PMU (power management unit). The PMU is a specific block that distributes power across the chip based on the demand of each component and what it is doing.

Zones that receive energy in the chip, minimizing expenditure by zone

Each component of the chip is physically divided into energy zones. This means that when part of the chip is being used, it no longer happens as before, where the entire chip received power even though perhaps only a specific zone of transistors was in use. Now, being divided into closed and isolated areas, the PMU instructs the chip to send exactly the energy each operation needs to the exact zone, and when the operation finishes, to stop supplying it so as not to leave that zone powered on.

It is as if our house had many rooms and we only turned on the lights when necessary, because there is someone inside; when they leave, the presence detector reports it and the PMU turns off the light. That is one of the keys (along with the better efficiency of the OLED screen) that allows a battery only slightly larger than the previous models' to gain up to 4 extra hours of autonomy on the 11 Pro and 5 on the 11 Pro Max (or one hour on the iPhone 11).

A13

Without a doubt, a chip that is a magnificent piece of engineering, built on a second-generation 7 nm process, with a total of 8.5 billion transistors giving life to the different specialized components, in addition to the CPU, GPU and Neural Engine, which in this generation work as a single unit.

I just hope that after this article we value a little more the incredible work of innovation that this A13 chip represents, the brain of the new iPhones. Remember that innovation, according to the RAE dictionary, is: "Creation or modification of a product, and its introduction into a market." And if that is not clear enough, to innovate is: "To change or alter something, introducing novelties." It could not be clearer.

