
Medium-Term (Five-Week) iPhone 16 Review

Nov 3, 2024

13 min read


Another Year, Another iPhone. Does Apple Deliver Another Hit? Here is my iPhone 16 Review.

iPhone on a table

Introduction


Welcome to my annual iPhone review! This review is big because I’m reviewing one of the most consequential tech products of the year—the iPhone 16.


I picked up the iPhone 16 in Ultramarine, and I’ve been using it for about five weeks now. Yep, I decided to go with the non-Pro iPhone, which is the model most people in the world will buy when they buy a new iPhone. Although the Pro models are Apple’s most technologically advanced, they aren’t the ones most people will purchase, and that’s partly why I went this route.


However, there are three other reasons I decided not to go with the Pro model this year.


First is the processor. I’ll get into the performance of the iPhone 16 later, but the new A18 chip is two generations newer than the A16 Bionic that was in the iPhone 15, and it’s really good. So I don’t feel a strong need to go up a model to get the A18 Pro chip.


Second, the Pro has a telephoto lens for 5x optical-quality zoom. While I do appreciate having great photos, I don’t consider myself a photographer, so the extra lens isn’t as appealing to me as it might be for some.


Third, for the last couple of years I’ve been using the iPhone 14 Pro Max, which has a 6.7-inch display. Quite frankly, one of the things I didn’t like about it was the size: it wasn’t comfortable to use with one hand, it didn’t always fit in my pocket well, and it was just too big of a phone. So I decided I’d like to scale things down significantly with the 6.1-inch display on the iPhone 16.


So that’s how I ended up with this iPhone 16. It’s been out for several weeks now, so this is somewhat of a medium-term review. Let’s dig in!



Physical Design


Let’s talk about the design of the phone for a minute. Apple decided that this year they would add more colour to the lineup. The available colours are Black, White, Pink, Teal, and Ultramarine, which is what I ended up purchasing. I think it’s a nice colour. It looks good on the aerospace-grade aluminum rails and the colour-infused glass back.



the available colours on Apple’s website
The available colours on Apple’s website

Apple has maintained the IP68 dust and water resistance as well. On the front of the phone is Apple’s custom Ceramic Shield glass, which they claim is 2x tougher than any other smartphone glass.


Beyond that, there are a couple of new and notable characteristics of the phone. For one, there are two new buttons on it.



Camera Control


iPhone laying on the table - close up of the camera control
Close-up of the camera control button on the side of the phone

The button on the bottom right is the new Camera Control. It’s a dedicated hardware button that opens the camera immediately. Apple knows that so many people use their phones to take pictures, and the quicker you can access your camera app, the better. A single press of the button opens the Camera app, and pressing it again takes a photo.


Pressing the button once to launch the camera, pressing it again to take a picture
Pressing the button once to launch the camera, pressing it again to take a picture

The button is also pressure sensitive: a light press brings up the current fine-tuning control, a light double press brings up the different settings you can fine-tune, like zoom, exposure, styles, tone, and depth of field, and a firm press takes a photo.


swiping through the various settings
Swiping through the various settings

Apple plans to update this button later in the year to support Visual Intelligence, allowing you to use the camera for tasks like scanning objects or locations for information. In their iPhone presentation, they provided examples like holding your camera up to a restaurant to pull up reviews or scanning a dog to find out what breed it is. Since that’s not out yet, I can’t review it to tell you how well it works, but I’m definitely excited about that.
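For the developers reading this, Apple has also opened up Camera Control to third-party camera apps in iOS 18. As a rough sketch only - I’m assuming the AVCaptureControl additions Apple announced alongside the iPhone 16, so double-check the current AVFoundation documentation - hooking the button’s light-press zoom overlay into your own capture session looks something like this:

```swift
import AVFoundation

// Rough sketch: exposing a zoom control on the Camera Control button from a
// third-party camera app. Assumes the iOS 18 AVCaptureControl API; verify
// names and availability against Apple's current documentation.
final class CameraController {
    let session = AVCaptureSession()

    func attachCameraControl(to device: AVCaptureDevice) {
        // Camera Control integration only exists on supported hardware.
        guard session.supportsControls else { return }

        // A system-provided zoom slider that appears in the Camera Control
        // overlay when the user lightly presses the button.
        let zoomSlider = AVCaptureSystemZoomSlider(device: device)

        session.beginConfiguration()
        if session.canAddControl(zoomSlider) {
            session.addControl(zoomSlider)
        }
        session.commitConfiguration()
    }
}
```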



Action Button


The other new physical button is the action button, located right above the volume buttons on the phone's left side.


Closeup of the action button
Closeup of the action button (right above the volume buttons)

The action button replaces the old ring/silent switch found on previous phone models, which allowed users to quickly toggle between silent and ring modes. In addition to performing this function, the action button can be customized to perform a variety of other tasks. It’s essentially a specialized button that can be set to do one thing, and that one thing is whatever the user decides. In Settings, you can choose from all of these options:


  • Silent Mode: switch between silent and ring.

  • Focus: toggle between Focus modes, which help eliminate distractions.

  • Camera: open the Camera app quickly.

  • Flashlight: quickly activate the flashlight.

  • Voice Memo: quickly start recording a personal note.

  • Recognize Music: activate the built-in Shazam feature, which can identify the song you’re hearing.

  • Translate: open the Translate app to translate a phrase or have a conversation with someone.

  • Magnifier: use the magnification feature to zoom in on objects.

  • Controls: activate one single control of your choosing. Examples include Wallet, dark mode, Home, stopwatch, airplane mode, personal hotspot, and calculator.

  • Shortcut: open an app or run your favourite shortcut.

  • Accessibility: quickly use an accessibility feature, like magnification or colour inversion.

  • No Action: have the button do nothing, if that’s what you prefer.


Screenshot of the setting for the action button
The available options for the Action button

My take on the action button? Personally, I have a shortcut set up to simply open my home security app. That’s the most useful thing I could think of off the top of my head. That’s because when I arrive or leave home, I want to quickly arm or disarm my system, unlock my door, or open the garage door, for example. I’ve always found it a bit too fiddly to find the app and open it in a hurry. But with a single button press and hold, boom, there it is, instantly. That’s not the most exciting use case, but it works well for me.
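If you’re wondering how third-party apps show up under that Shortcut option in the first place, it’s through Apple’s App Intents framework. Here’s a minimal, hypothetical sketch - the intent and app are made up for illustration - of how a security app could expose an action that the Action button (or Siri, or the Shortcuts app) can trigger:

```swift
import AppIntents

// Hypothetical example: a home-security app exposing an "arm system" action.
// Once an app ships an App Intent like this, it appears in the Shortcuts app
// and can be bound to the Action button via the Shortcut option.
struct ArmSecuritySystemIntent: AppIntent {
    static var title: LocalizedStringResource = "Arm Security System"

    // Bring the app to the foreground when the intent runs.
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        // A real app would call into its security SDK here to arm the system.
        return .result()
    }
}

// Optionally surface it as an App Shortcut so it's discoverable without any
// manual setup in the Shortcuts app.
struct SecurityShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ArmSecuritySystemIntent(),
            phrases: ["Arm my system in \(.applicationName)"],
            shortTitle: "Arm System",
            systemImageName: "lock.shield"
        )
    }
}
```

That’s exactly the kind of one-tap action the Action button is good for.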



Apple Intelligence



Apple Intelligence Logo
Source: Apple

Now, let’s shift our focus to Apple Intelligence, Apple's new generative AI platform designed to improve user experience across its devices, including the iPhone, iPad, and Mac. Announced during the WWDC 2024 event and further detailed at the iPhone 16 launch, Apple Intelligence aims to integrate advanced AI capabilities into existing applications while, according to Apple, prioritizing user privacy.


The first thing to mention about Apple Intelligence is that it’s not fully available at this point. In fact, iOS 18.0, which shipped on this device, had no Apple Intelligence features at all. Only iOS 18.1, which Apple released just a couple of weeks ago, has some of the features. Apple says it is rolling out its AI features over several upcoming software releases, some stretching into 2025.


As of now (November 2024), there are some interesting Apple Intelligence features available in iOS 18.1, such as:

  • Writing tools for rewriting, summarizing, and proofreading text.

  • Improved Siri functionality with contextual follow-ups and the ability to type to Siri.

  • Cleanup tool in photos for removing unwanted objects.

  • Notification summaries.

  • Suggested replies in Messages and Mail.

  • Custom Memories movies in Photos.

  • Article summaries in Safari's Reader Mode.


There are still several highly anticipated features coming in the next update, such as Genmoji (Apple’s AI-generated emojis), Image Playground (to create AI-generated images), and ChatGPT integration.


So let’s evaluate one by one the features that are available today.



Writing Tools


Writing Tools are a set of AI-powered tools available anytime you’re writing text, for example in Notes. To access them, all you need to do is select the text you’ve written, and you’ll notice that Writing Tools is one of the available options.


Within Writing Tools, you have several options.



Text selected in the notes app
Writing Tools options when you select some text

Proofread. This tool checks for spelling, grammar, and word choice errors, allowing you to review and accept suggested corrections.


The proofread option in writing tools
Proofread showcasing a grammar mistake.

Rewrite. This button will launch Apple Intelligence into action and immediately rewrite your selected text. You’ll usually get a more grammatically correct, professional version of your text if you just hit the big rewrite button. However, you also get style options if you want to have it written a certain way. You can choose writing styles like friendly, professional, and concise.



The rewrite with option to revert to original
The rewritten content is pasted into the document

Summarize. You can, of course, also summarize your selected text into a few simple sentences or into key points.



The option to create a summary
Summarize options pop up under writing tools

Text Reformatting. On top of all that, you can reformat the text into list format, or even into a table. My experience here is that the reformat-to-table option will only include text it deems to be table-worthy. That’s actually an intelligent decision in my books, but some people may be surprised by it.
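One nice detail is that Writing Tools isn’t limited to Apple’s own apps: anything built on the standard system text views gets it more or less for free, and developers can tune how much of the experience they allow. Here’s a hedged sketch, assuming the writingToolsBehavior property Apple introduced for UIKit text views in iOS 18 (verify the exact names against the current documentation):

```swift
import UIKit

// Sketch: opting a text view into the full Writing Tools experience.
// Assumes the iOS 18 writingToolsBehavior API; verify against current UIKit docs.
func configureEditor(_ textView: UITextView) {
    if #available(iOS 18.0, *) {
        // .complete allows full inline rewrites; .limited keeps suggestions in
        // an overlay panel; .none opts the view out of Writing Tools entirely.
        textView.writingToolsBehavior = .complete
    }
}
```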



Improved Siri Functionality


Siri’s natural language understanding has also improved, thanks to the large language models now underpinning it. Even if you stumble over your words or change your mind midway through your sentence, Siri can still interpret your request. Plus, Siri is context-aware within the timeframe of your open conversation, so you can ask follow-up questions without repeating the original context.


An example is asking what the weather is like in San Francisco today, and then asking a follow-up question about the weather tomorrow. Siri will know you’re still talking about San Francisco.



Asking Siri the weather in San Francisco
Asking Siri the weather in San Francisco


Asking Siri the weather for the following day
Asking Siri the weather for the following day

Then there’s Type To Siri, which allows you to have a conversation with Siri without actually having to speak, great for those moments when you are in public or would rather type than talk.


To activate Type To Siri, just double-tap the very bottom of your screen, on the little home bar. That brings up a keyboard interface with a message box, along with the familiar glow indicating Siri has been activated. It works just as well as speaking: just tell Siri what you want to do or know.



Typing to Siri to ask the weather in Miami
Type To Siri


Cleanup Tool In Photos


The Photos app also has a new AI image-editing option called the Cleanup tool. It allows you to remove unwanted elements from photos, including people and objects.


The tool intelligently highlights items it thinks you might want to erase; you’ll see a sort of fluorescent, multicoloured lighting effect on those items. All you have to do in that case is tap to erase them.



Clean up tool recommending objects to remove
Clean up tool recommending objects to remove

If the item you want to remove is not highlighted, all you have to do is just circle or scribble over it to initiate its removal.



using the clean up brush to paint over an object to remove it
Using the clean-up brush to paint over an object to remove it


Obviously, your results with this tool will vary depending on how complicated the background is and how big the object is that you’re trying to remove. There are instances where the picture looks considerably worse after editing, because the fill the tool generates just doesn’t look good. In that case, all the edits you make are reversible, so you can either undo each erasure or revert the entire image to its original form.


However, I’ve found that in many cases the results have been quite good, even convincing.


an edited image with unwanted people removed
The finished image with unwanted people removed

When you use the cleanup tool, there is a note added in the metadata that states “modified with clean up”, indicating that it’s an AI-edited image.


Notification Summaries


If there’s one use case that has been a constant with generative AI, it’s the summarization of content. We’ve probably all used ChatGPT for that very purpose. And notification summaries aim to solve a very real problem when it comes to all the information that gets served up on our phones.


Think about it. You probably get notifications for emails, text messages, group chats, and more. Sometimes you can’t tell what the content of an email is just based on the subject line that shows up in your notification. Sometimes you can’t tell what your group chat is talking about by the last message received. And sometimes your partner sends you a message that’s so long you can’t tell what they’re talking about until you go in and read it.


Notification summaries can handle all of those situations. When you get a notification that has been summarized, you’ll notice a little icon on the notification that looks like a paragraph of text with an arrow wrapping around it.


If an email comes in, instead of a subject line you might get a summary of the email itself. For text messages, you might get a summary of the key points of the message. And for group chats where you’ve received multiple messages, you will get the highlights of what the conversation is about.



An email that has been summarized in the inbox
An email that has been summarized in the inbox

The summaries work for other applications as well, where you might receive multiple notifications over a short period of time. For example, my security system might send me ten or fifteen notifications in a day as the system is armed or disarmed, or when people are seen on the cameras. Same with the other smart security cameras I have through the Wyze app.


A summary of events when multiple notifications have been sent from a supported application
A summary of events when multiple notifications have been sent from a supported application

Summaries are a welcome and useful feature within Apple Intelligence.


More AI Features


And it doesn’t stop there, because there are more Apple Intelligence features throughout the operating system. For example, you can now create custom memory movies in Photos. Just type in what kind of video you want based on the photos you know you have - like a memory movie of your child when they were younger, set to the specific type of music you want.


Writing a description of the memory movie you want in the photos app
Writing a description of the memory movie you want in the photos app


The generated movie ready to play in Photos
The generated movie ready to play

There are also suggested replies in Messages and Mail. For example, in Messages, when someone sends you a text message that seems somewhat open-ended, you’ll automatically get a couple of suggested replies. They aren’t paragraphs or anything, more like two-word replies, but it’s a handy feature.


Suggested replies below the message box
Suggested replies below the message box


If you’re into reading articles, you probably already know about Safari’s Reader mode. When you go to an article on the web, you can tap the little icon in the address bar, then tap Show Reader. Now, however, you’ll see a new Summarize button once you’re in Reader mode. Tap it and you get a paragraph or two of summary. To me, that’s a nice way to decide whether you want to dive deeper into the article itself, so again, a very useful tool.


An Apple Intelligence article summary at the top of the article
A summarized article on the web


Camera System


Closeup of the cameras on the back of the phone

The iPhone 16 features a new 48MP Fusion camera and a 12MP Ultrawide lens. The 48MP main sensor enables you to take high-resolution photos at up to 48MP. By default, though, the resolution is set to 24MP, which keeps your file sizes a little smaller while still giving you a big boost in resolution over the previous generation.
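That 24MP default is a processing choice rather than a hardware limit. For anyone curious how a camera app asks for the full 48MP, here’s a rough sketch using AVFoundation’s maxPhotoDimensions - my assumption of how third-party apps opt in, so treat it as illustrative rather than gospel:

```swift
import AVFoundation

// Sketch: requesting the largest photo resolution the active camera format
// supports (48MP on the iPhone 16 main camera). Assumes the iOS 16+
// maxPhotoDimensions APIs; verify against Apple's current documentation.
func enableFullResolutionCapture(device: AVCaptureDevice, output: AVCapturePhotoOutput) {
    // The active format advertises the photo sizes it can produce.
    guard let largest = device.activeFormat.supportedMaxPhotoDimensions
        .max(by: { $0.width * $0.height < $1.width * $1.height }) else { return }

    // Raise the output's ceiling, then request that size for each capture.
    output.maxPhotoDimensions = largest

    let settings = AVCapturePhotoSettings()
    settings.maxPhotoDimensions = largest
    // output.capturePhoto(with: settings, delegate: ...) would then shoot at 48MP.
}
```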


Shortly after getting the iPhone 16, I took a trip down to Miami Beach with my family and I took a few photos. I’m an amateur photographer by the most generous standards, but I was quite pleased with the quality of the shots I was able to capture.


Image captured with the 48MP Fusion camera
Image captured with the 48MP Fusion camera

With the enhanced resolution of the 48MP Fusion camera, you can now capture zoomed-in shots at up to 2x with optical quality, resulting in a 12MP image at 2x zoom. It’s not the 3x, 5x, or even 10x that some pro smartphones offer, but it’s nice to at least get 2x. This might be one reason some people choose the 16 Pro, since it has an additional telephoto lens that gets you to 5x with optical quality.


Then there’s the ultrawide lens which is 12MP and can be used to capture images with a 120-degree field of view. It’s ideal for expansive landscapes, large rooms, or for otherwise making your photos feel grandiose.


Image captured with the 12MP ultrawide lens
Image captured with the 12MP ultrawide lens

It also enables you to capture macro shots with a high degree of detail. I haven’t put that to the test yet but I’d like to try macro photography out a bit more to get the hang of it. I might write an article on that at some point.


Low-light performance is also said to have been improved by the 48MP Fusion camera. I didn’t put it through a high degree of testing, but it seems alright - not brilliant, though. This portrait mode shot of me is a little fuzzier than I’d like. Maybe I wasn’t standing still enough. Either way, I wouldn’t say low-light performance is much better than the previous generation.


Low light portrait mode photo
Low light portrait mode photo

Another thing you might notice about the camera system on the iPhone 16 is the layout of the actual lenses on the back of the phone. The straight-line layout allows you to capture spatial photos and video, essentially 3D-looking photos that can then be viewed on the Apple Vision Pro. I don’t have a Vision Pro so I didn’t bother testing that feature out.


Also new on the iPhone 16 are Photographic Styles. This feature allows you to change the colour composition of the photos you take. Apple pitches it as a way to capture your desired skin-tone rendering, since everyone has a sense of how they look in real life - and they want their photos to match. Or sometimes you just like to have a certain style to your photos. You can set a default style in Settings, which will guide you through choosing between the available options of Standard, Amber, Gold, Rose Gold, Neutral, and Cool Rose.


Screenshots of the different photographic styles
The different photographic styles

But it doesn’t stop there. When editing a photo there are even more options, and you can change the photographic style at any time. You get additional options like Vibrant, Natural, Luminous, Dramatic, Quiet, Cozy, Ethereal, Muted Black and White, as well as Stark Black and White.



editing a photo and scrolling through the different photographic styles
Editing a photo and scrolling through the different photographic styles


Having non-destructive editing means that you can save a photo with a specific photographic style, but then change things up later on as you see fit.


A18 Chip


A18 Chip graphic
Source: Apple

No iPhone review would be complete without a discussion of the system on a chip. Along with the new iPhones this year, Apple released the A18 system on a chip, as well as the A18 Pro that’s in the iPhone 16 Pro - chips that Apple says are custom-built for Apple Intelligence.


The A18 is about 30% faster than the A16 Bionic that was in the iPhone 15 last year. It’s got six processor cores, consisting of two high-performance cores and four efficiency cores. The A18 features a next-generation 16-core Neural Engine that's twice as fast as the one in the A16. This results in faster and more efficient machine learning tasks, including image processing and natural language processing.


The iPhone 16's A18 chip brings forward a significant gaming development. The standard iPhone models now get to play AAA games that were once only for the Pro models. That's because the A18 has a 40% GPU boost compared to the last chip, and it also has hardware-accelerated ray tracing.



Image of iPhone with Assassin’s Creed image

I’m not big into mobile gaming myself, but in my real-world usage, the iPhone 16 is lightning fast, just as you’d expect from a brand-new iPhone. The A18 is a top-tier chip and in most cases, it's at the top of the charts when it comes to CPU performance in smartphone chips. The Apple Intelligence features I mentioned before also work well, every time, without a big delay or loading period, and I do believe those are computationally intensive tasks.



Conclusion

iPhone standing on its edge

Overall, the iPhone 16 is a great phone, which makes a lot of sense, because Apple usually produces winners. In this case, I applaud them for trying some new things, particularly the Action button and the Camera Control. I’m not 100% convinced these additions will stick around long-term, but time will tell. For example, I find myself using the Camera Control frequently, but not so much the Action button. But that’s just me.


The A18 chip is a powerhouse, enabling smooth multitasking and the ability to tackle demanding AI workloads. The camera system, with its 48MP Fusion camera and 12MP Ultrawide lens, delivers solid image quality and works well in most situations.  


Apple Intelligence, though still in its early stages, demonstrates the potential to redefine the user experience.  Features like enhanced Siri functionality, the Cleanup tool in Photos, and Notification Summaries offer a glimpse into the future of AI capabilities.


Overall, the iPhone 16 is a device that caters to a wide range of needs and preferences, making it a compelling choice.  With its impressive performance, solid camera system, Action button, and the promise of evolving AI capabilities, the iPhone 16 is worth your consideration as your next phone.


Video Review


If you've made it this far, you might also enjoy my video review of the iPhone 16, posted on YouTube. Check it out below, and thanks for supporting my content!






