Exploring Apple: how Apple benefits from AI/ML, how Apple brings AI to its products, and how it became the first U.S. company to reach a US$2 trillion market capitalization.

Subhashis Paul
7 min read · Oct 20, 2020


“COMPUTERS ARE ABLE TO SEE, HEAR AND LEARN. WELCOME TO THE FUTURE” — DAVE WATERS

Yes, this is absolutely right. We know a computer has no brain of its own, but it can learn from us. How do we human beings learn? We see, we hear, we feel, we experience, and then we learn. Computers can also see us and hear us. So why can't they learn? Obviously they can. To teach them, we need to write programs, because computers only understand binary (0s and 1s). Writing programs that let a computer learn to make predictions from data is called machine learning; when the model is a deep neural network, it is called deep learning.

Before starting the discussion, consider our daily lives: most of us use LinkedIn, Facebook, Google, Instagram, Twitter, Netflix, and so on. On LinkedIn, when someone messages you, the chat box suggests predicted replies to send. If I type a letter in the search bar, it shows popular names starting with that letter, or the profiles I visit most often. How does it predict that? On Netflix, after we watch a few movies, it recommends what to watch next, and sometimes it's spot on: yes, that's exactly the movie I want to see now! How does it know? Space agencies send rockets and satellites into space without humans on board, and they work perfectly. How? Every prediction in our technical daily lives is made possible by machine learning and artificial intelligence. So by now it should be fairly clear what machine learning does.
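To make the search-bar example concrete, here is a minimal sketch of prefix-based suggestion ranking, the kind of logic behind such predictions. The profile names and visit counts below are invented purely for illustration; real systems rank with learned models over far richer signals.

```python
# Toy search-bar autocomplete: suggest profiles whose names start with
# the typed prefix, ranked by how often each profile was visited.
# Names and visit counts are made up for illustration.

profiles = {
    "Alice Johnson": 42,   # profile name -> visit count
    "Alan Turing": 97,
    "Albert Gates": 5,
    "Bob Smith": 60,
}

def suggest(prefix, k=3):
    """Return up to k names matching the prefix, most-visited first."""
    matches = [n for n in profiles if n.lower().startswith(prefix.lower())]
    return sorted(matches, key=lambda n: -profiles[n])[:k]

print(suggest("al"))  # most-visited "Al..." profiles first
```

Ranking by a popularity signal rather than alphabetically is what makes the suggestions feel "predictive" rather than a plain dictionary lookup.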

Nowadays, big MNCs are using machine learning to grow their businesses and to create a friendlier experience between customers and companies. Here are some of the top companies benefiting from it:

1. Google
2. IBM
3. Baidu
4. Microsoft
5. Twitter
6. Qubit
7. Intel
8. Apple
9. Salesforce
10. Pindrop

Apple:

Apple is an American multinational company headquartered in California that designs, develops, and sells consumer electronics, computer software, and online services. Founded in 1976 by Steve Jobs, Steve Wozniak, and Ronald Wayne, the company became known for delivering a range of technically advanced mobile phones, the iPhone line. Following the iPhone's success, the company developed other products such as the iPad, AirPods, and Apple TV, among others. In August 2020, Apple became the first U.S. company to reach a US$2 trillion market capitalization.

What is Apple’s AI strategy?

John Giannandrea, Apple's senior vice president of machine learning and AI strategy, has described the approach this way: "I think that Apple has always stood for that intersection of creativity and technology. And I think that when you're thinking about building smart experiences, having vertical integration, all the way down from the applications, to the frameworks, to the silicon, is really essential… I think it's a journey, and I think that this is the future of the computing devices that we have, is that they be smart, and that smart sort of disappear."

Apple is best positioned to “lead the industry” in building machine intelligence-driven features and products.

How does Apple use machine learning today?

Apple has made a habit of crediting machine learning with improving some features of the iPhone, Apple Watch, or iPad in its recent marketing presentations, but it rarely goes into much detail, and most people who buy an iPhone never watch those presentations anyway. Contrast this with Google, for example, which places AI at the center of much of its messaging to consumers.

There are numerous examples of machine learning being used in Apple’s software and devices, most of them new in just the past couple of years.

Machine learning is used to help the iPad's software distinguish between a user accidentally resting their palm on the screen while drawing with the Apple Pencil, and an intentional press meant as input. It's used to monitor users' habits to optimize device battery life and charging, both to extend the time between charges and to protect the battery's long-term health. It's used to make app recommendations.
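Apple has not published how its palm rejection works, so purely as an illustration, a touch classifier of this kind can be sketched as a nearest-centroid model over simple features like contact area and movement speed. Every number below is invented; a production model would be trained on real touch data with many more features.

```python
import math

# Toy nearest-centroid classifier for touch input.
# Features: (contact area in mm^2, movement speed in mm/s).
# The centroids stand in for averages learned from labeled touches;
# Apple's actual palm-rejection model is not public.
centroids = {
    "palm":   (180.0, 2.0),   # large, mostly static contact
    "pencil": (4.0, 35.0),    # tiny, fast-moving tip
}

def classify(area, speed):
    """Label a touch by its nearest class centroid (Euclidean distance)."""
    return min(centroids,
               key=lambda label: math.dist((area, speed), centroids[label]))

print(classify(150.0, 1.0))   # big, slow contact -> "palm"
print(classify(3.5, 40.0))    # small, fast contact -> "pencil"
```

The point of the sketch is the framing: "accident vs. intent" becomes an ordinary binary classification problem once the touch is described by measurable features.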

Then there’s Siri, which is perhaps the one thing any iPhone user would immediately perceive as artificial intelligence. Machine learning drives several aspects of Siri, from speech recognition to attempts by Siri to offer useful answers.
Savvy iPhone owners might also notice that machine learning is behind the Photos app’s ability to automatically sort pictures into pre-made galleries, or to accurately give you photos of a friend named Jane when her name is entered into the app’s search field.
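Under the hood, photo features like this typically work by mapping each detected face to an embedding vector and grouping nearby vectors into the same person. Here is a minimal greedy-clustering sketch; the 2-D "embeddings" are fabricated, whereas real face embeddings from a neural network have hundreds of dimensions.

```python
import math

# Toy face grouping: each detected face is an embedding vector, and
# faces whose embeddings are close together are assumed to be the
# same person. Embeddings here are invented 2-D points.
THRESHOLD = 1.0

def group_faces(embeddings):
    """Greedily assign each embedding to the first cluster within THRESHOLD."""
    clusters = []  # each cluster is a list of embeddings
    for emb in embeddings:
        for cluster in clusters:
            if math.dist(emb, cluster[0]) < THRESHOLD:
                cluster.append(emb)
                break
        else:
            clusters.append([emb])
    return clusters

faces = [(0.1, 0.2), (0.15, 0.25), (5.0, 5.1), (0.05, 0.18), (5.2, 5.0)]
print(len(group_faces(faces)))  # two distinct people -> 2 clusters
```

Searching for "Jane" then reduces to returning the cluster whose embeddings match photos already tagged with that name.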

  • Facial recognition for HomeKit. HomeKit-enabled smart cameras will use photos you’ve tagged on your phone to identify who’s at your door and even announce them by name.
  • Native sleep tracking for the Apple Watch. This uses machine learning to classify your movements and detect when you’re sleeping. The same mechanism also allows the Apple Watch to track new activities like dancing and…
  • Handwashing. The Apple Watch not only detects the motion but also the sound of handwashing, starting a countdown timer to make sure you’re washing for as long as needed.
  • App Library suggestions. A folder in the new App Library layout will use “on-device intelligence” to show apps you’re “likely to need next.” It’s small but potentially useful.
  • Translate app. This works completely offline, thanks to on-device machine learning. It detects the languages being spoken and can even do live translations of conversations.
  • Sound alerts in iOS 14. This accessibility feature wasn’t mentioned onstage, but it will let your iPhone listen for things like doorbells, sirens, dogs barking, or babies crying.
  • Handwriting recognition for iPad. This wasn’t specifically identified as an AI-powered feature, but we’d bet dollars to donuts it is. AI is fantastic at image recognition tasks, and identifying both Chinese and English characters is a fitting challenge.
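The handwashing item above combines two signals, motion and sound. As a hedged sketch of that kind of sensor fusion (thresholds, scores, and time steps all invented here), a countdown starts only when both classifiers agree:

```python
# Toy sensor fusion for handwashing detection: two classifiers each
# emit a confidence score per time step, and a 20-second countdown
# starts only when both scores cross a threshold. All values invented;
# Apple's actual detector is not public.
MOTION_THRESHOLD = 0.7
SOUND_THRESHOLD = 0.7
WASH_SECONDS = 20

def countdown_start(motion_scores, sound_scores):
    """Return the first time step where both signals agree, or None."""
    for t, (m, s) in enumerate(zip(motion_scores, sound_scores)):
        if m >= MOTION_THRESHOLD and s >= SOUND_THRESHOLD:
            return t
    return None

motion = [0.2, 0.8, 0.9, 0.9]   # scrubbing motion detected from step 1
sound  = [0.1, 0.3, 0.8, 0.9]   # water/soap sound detected from step 2
print(countdown_start(motion, sound), WASH_SECONDS)
```

Requiring both signals is what keeps, say, vigorous hand-waving without running water from triggering the timer.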

In other cases, few users may realize that machine learning is at work. For example, your iPhone may take multiple pictures in rapid succession each time you tap the shutter button. An ML-trained algorithm then analyzes each image and can composite what it deems the best parts of each image into one result.
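One simplified version of that idea is to score each frame's sharpness and keep the best one. The sketch below uses the variance-like spread of neighboring-pixel differences as a crude sharpness proxy on tiny invented grayscale frames; Apple's real pipeline composites the best regions of multiple frames, not whole frames.

```python
# Toy burst selection: score each grayscale frame by the mean squared
# difference between horizontally adjacent pixels. Blurry frames have
# smaller local differences, so higher scores mean sharper frames.
# Frames here are tiny invented 2-D lists of pixel intensities.
def sharpness(frame):
    total, count = 0, 0
    for row in frame:
        for a, b in zip(row, row[1:]):
            total += (a - b) ** 2
            count += 1
    return total / count

def best_frame(frames):
    """Return the index of the sharpest frame in the burst."""
    return max(range(len(frames)), key=lambda i: sharpness(frames[i]))

blurry = [[10, 11, 12], [11, 12, 13]]   # smooth gradients -> low score
sharp  = [[0, 90, 5],  [80, 3, 95]]     # strong edges -> high score
print(best_frame([blurry, sharp]))  # index of the sharper frame
```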

AI is behind Apple’s handwashing assistance feature in the Apple Watch

Phones have long included image signal processors (ISP) for improving the quality of photos digitally and in real time, but Apple accelerated the process in 2018 by making the ISP in the iPhone work closely with the Neural Engine, the company’s recently added machine learning-focused processor.

In a 2020 interview, Apple executives described it this way: "There's a whole bunch of new experiences that are powered by machine learning. And these are things like language translation, or on-device dictation, or our new features around health, like sleep and hand washing, and stuff we've released in the past around heart health and things like this.

"It's hard to find a part of the experience where you're not doing some predictive [work]. Like, app predictions, or keyboard predictions, or modern smartphone cameras do a ton of machine learning behind the scenes to figure out what they call 'saliency,' which is like, what's the most important part of the picture? Or, if you imagine doing blurring of the background, you're doing portrait mode.

"All of these things benefit from the core machine learning features that are built into the core Apple platform."
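The "saliency" idea, finding the most important part of the picture, can be sketched very crudely: score each pixel by how far it deviates from the image's average intensity and threshold the result. Real saliency models are neural networks; this stand-in, with an invented toy image, only illustrates the input/output shape of the problem.

```python
# Toy saliency map: mark pixels that deviate strongly from the image's
# mean intensity. Real saliency estimation uses trained neural networks;
# this is only a stand-in for the concept.
def saliency_mask(image, factor=2.0):
    pixels = [p for row in image for p in row]
    mean = sum(pixels) / len(pixels)
    # Threshold: a multiple of the mean absolute deviation.
    cutoff = factor * sum(abs(p - mean) for p in pixels) / len(pixels)
    return [[abs(p - mean) > cutoff for p in row] for row in image]

image = [
    [10, 10, 10, 10],
    [10, 200, 210, 10],   # bright subject in the middle
    [10, 10, 10, 10],
]
mask = saliency_mask(image)
print(mask[1][1], mask[0][0])  # subject pixel vs. background pixel
```

A portrait-mode effect would then blur only the pixels the mask marks as non-salient background.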

Further, you may have noticed Apple’s software and hardware updates over the past couple of years have emphasized augmented reality features. Most of those features are made possible thanks to machine learning.

On augmented reality in particular: "Machine learning is used a lot in augmented reality. The hard problem there is what's called SLAM, so Simultaneous Localization And Mapping. So, trying to understand if you have an iPad with a lidar scanner on it and you're moving around, what does it see? And building up a 3D model of what it's actually seeing.

"That today uses deep learning and you need to be able to do it on-device because you want to be able to do it in real time. It wouldn't make sense if you're waving your iPad around and then perhaps having to do that at the data center. So in general I would say the way I think about this is that deep learning in particular is giving us the ability to go from raw data to semantics about that data."
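To give a feel for the "mapping" half of SLAM, the sketch below projects lidar readings, given as (range, bearing) pairs in the device's own frame, into world coordinates using a known device pose. All poses and readings are invented, and real SLAM must also estimate the pose itself rather than being handed it.

```python
import math

# Toy "mapping" step of SLAM: given the device pose (x, y, heading in
# radians) and lidar readings as (range, bearing) pairs measured in the
# device frame, compute world coordinates of the observed points.
# Real SLAM jointly estimates the pose too; values here are invented.
def map_points(pose, readings):
    x, y, heading = pose
    points = []
    for rng, bearing in readings:
        angle = heading + bearing          # rotate into the world frame
        points.append((x + rng * math.cos(angle),
                       y + rng * math.sin(angle)))
    return points

pose = (1.0, 2.0, math.pi / 2)             # at (1, 2), facing +y
readings = [(3.0, 0.0), (2.0, -math.pi / 2)]
for px, py in map_points(pose, readings):
    print(round(px, 2), round(py, 2))
```

Accumulating such points over many poses is what builds up the 3D model of the room; doing it in real time is why the computation has to stay on-device.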

Increasingly, Apple performs machine learning tasks locally on the device, on hardware like the Apple Neural Engine (ANE) or on the company’s custom-designed GPUs (graphics processing units).

Conclusion:

Machine learning can be a competitive advantage for any company, whether a top MNC or a startup, because work that is done manually today will be done by machines tomorrow. The machine learning revolution is here to stay.

Thank you.
