Apple and Google are two companies so diametrically opposed that the mere thought of them heading in the same direction seems unfathomable. Yet the developer conferences of both companies, now well behind us, delivered a plethora of software, some flashy features, and a clearer sense of where both companies and their platforms are headed.
Both companies had a long list of software to get through, which is why both Apple's WWDC and Google's I/O can come across as a scattered bunch of announcements. If it felt that way, it's not your fault: Apple had at least four major software announcements to make, and the same goes for Google.
A few examples support the claim that the two are converging. Both Apple and Google are releasing dashboards that show how much time you spend on your device (too much). Apple has finally made strides to fix its notification game: it has introduced grouped notifications, letting you dismiss all of an app's notifications at once, along with the ability to turn an app's notifications off without visiting the Settings app. These features already exist on Android.
The new Photos app in iOS 12 borrows a lot from Google Photos. It introduces a “For You” section that applies neat effects to your photos, and advanced search that lets you type in various search terms to find the exact photos you are looking for. It also adds suggested sharing, which identifies the people in your pictures and offers to create shared albums with them. All of these features already exist in Google Photos, though the implementations differ: Apple encrypts photos end to end and runs its AI on-device, whereas Google relies on its cloud infrastructure.
Now take the Shortcuts app on iOS and Slices on Android. Both are attempts to improve how Siri and Google Assistant, respectively, communicate with apps. They are also a manifestation of how different the two companies' philosophies are. In Google's case (Android P), app developers simply make a bunch of actions available to the Assistant, and you have to trust Google to figure out what you want and when you want it. Given the massive infrastructure Google already has, it can do this job very well.
With Shortcuts, on the other hand, you have to do a lot of the configuration yourself: you look for an “Add to Siri” button and record a trigger phrase for the shortcut. You can also chain multiple actions together. So on Android P you simply trust Google, while on iOS you configure things yourself.
Hence, it's fair to say that Apple is acting a bit more like Google. It would also be fair to point out that both companies see the same trends in tech and computing, and are therefore shaping their platforms in similar directions.
For all their similarities, however, there is one big difference between the two companies. When Apple rolls out its software update, it will reach many more of its users than Android P will when it arrives. That is one big advantage Apple has had, and will keep, for the foreseeable future.