Apple’s Mac Pro ships in December with up to 8TB of storage

Apple is making its Mac Pro and Pro Display XDR available in December, it announced today. The machine was unveiled earlier this year, but no availability window had been set.

In addition to the previously announced specs, Apple noted that the Mac Pro can be ordered with up to an 8TB SSD. Apple’s Pro Workflow Team continues to take feedback about the wants and needs of its pro customers, and Apple says that the Mac Pro can now handle up to six streams of 8K ProRes video, up from the three streams quoted back in June.

Apple also says that Blackmagic will have an 8K SDI converter for productions using a serial digital interface workflow on set or in the studio. This was a question I got several times after Apple announced its reference monitor to go along with the Mac Pro, and it makes the display more viable for many on-set applications built around existing workflows.

I was able to get a look at the Mac Pro running the SDI converter box, as well as a bunch of other applications like Final Cut Pro, and it continues to be incredibly impressive for pro workflows. One demo showed six 8K ProRes streams running with animation and color correction in real time in an un-rendered state. Really impressive. The hardware is also still wildly premium stuff. The VESA mount for the Pro Display XDR alone feels like it has more engineering behind it than most entire computers.

The new Mac Pro starts at $5,999 for its base configuration, which includes 32GB of RAM, a 256GB SSD and a Radeon Pro 580X graphics card. The Pro Display XDR, the 32-inch reference-quality monitor Apple will sell alongside it, starts at $4,999.

MacBook Pro 16” first impressions: Return of the Mack

In poker, complacency is a quiet killer. It can steal your forward momentum bit by bit, using the warm glow of a winning hand or two to cover the bets you’re not making until it’s too late and you’re out of leverage. 

Over the past few years, Apple’s MacBook game had begun to suffer from a similar malaise. Most of the company’s product lines were booming, including newer entries like the Apple Watch, AirPods and iPad Pro. But as problems with the MacBooks started to mount — unreliable keyboards, low RAM ceilings and anemic graphics offerings — the once insurmountable advantage the MacBook held over the rest of the notebook industry started to dwindle.

The new 16” MacBook Pro Apple is announcing today is an attempt to rectify most, if not all, of the major complaints of its most loyal, and vocal, users. It’s a machine that offers a massive amount of upside for what appears to be a handful of easily justifiable tradeoffs. It’s got better graphics, a bigger display at nearly no extra overall size, a bigger battery with longer life claims and, yeah, a completely new keyboard.

I’ve only had a day to use the machine so far, but I did all of my research and writing for this first look piece on the machine, carting it around New York City, through the airport and onto a plane where I’m publishing this now. This isn’t a review, but I can take you through some of the new stuff and give you thoughts based on that chunk of time. 

This is a rethink of the larger MacBook Pro in many significant ways. It’s a brand-new model that completely replaces the 15” MacBook Pro in Apple’s lineup, not an additional model.

Importantly, the team working on this new MacBook started with no design constraints on weight, noise, size or battery. This is not a thinner machine, it is not a smaller machine, it is not a quieter machine. It is, however, better than the current MacBook Pro in all of the ways that actually count.

Let’s run down some of the most important new things. 

Performance and thermals

The 16” MacBook Pro comes configured with either a 2.6GHz 6-core i7 or a 2.3GHz 8-core i9 from Intel. These are the same processors the 15” MacBook Pro came with; the lack of advancement here is largely a function of Intel’s chip readiness.

The i7 model of the 16” MacBook Pro will run $2,399 for the base model — the same as the old 15” — and it comes with a 512GB SSD and 16GB of RAM.

Both models can be ordered today and will be in stores at the end of the week.

The standard graphics configuration in the i7 is an AMD Radeon Pro 5300M with 4GB of memory, plus an integrated Intel UHD Graphics 630. The system continues to use the dynamic handoff system that trades power for battery life on the fly.


The i9 model will run $2,699 and comes with a 1TB drive. That’s a nice bump in storage for both models, into the range of very comfortable for most people. It rolls with an AMD Radeon Pro 5500M with 4GB of memory.

You can configure both models with an AMD Radeon Pro 5500M with 8GB of GDDR6 memory. Both models can also now get up to 8TB of SSD storage – which Apple says is the most in a notebook ever – and 64GB of 2666MHz DDR4 RAM, but I’d expect those upgrades to be pricey.

The new power supply delivers an additional 12W of power, and there is a new thermal system to handle it. The heat pipe that carries heat away from the processor has been redesigned, and there are more fan blades on 35% larger fans that move 28% more air compared to the 15” model.

The fans in the MacBook Pro, when active, put out the same decibel level of sound but push way more air than before. So, not a reduction in sound, but not an increase either — and the trade is better cooling. It’s another area where the design process for this MacBook focused on performance gains rather than the obvious sticker copy.

There’s also a new power brick, which is the same physical size as the 15” MacBook Pro’s adapter but now supplies 96W, up from 87W. The brick is still as chunky as ever and feels a tad heavier, but it’s nice to get some additional power out of it.

Though I haven’t been able to put the MacBook Pro through any video editing or rendering tests, I was able to see live demos of it handling several 8K streams concurrently. With the beefiest internal config, Apple says it can usually handle as many as 4, perhaps 5, un-rendered ProRes streams.

A bigger display, a thicker body

The new MacBook Pro has a larger 16” diagonal Retina display with a 3072×1920 resolution at 226 ppi. The display features the same 500-nit maximum brightness, P3 color gamut and True Tone tech as the current 15”. The bezels of the screen are narrower, which makes it feel even larger when you’re sitting in front of it. That also helps keep the overall size of the new MacBook Pro to just 2% larger in width and height, with a 0.7mm increase in thickness.

The overall increase in screen size far outstrips the increase in overall body size because of those thinner bezels. And this model is still around the same thickness as the 2015 15” MacBook Pro, an extremely popular model among the kinds of people who are the target market for this machine. It also weighs 4.3 lbs, heavier than the 4.02 lb current 15” model.

The display looks great, extremely crisp due to the increase in pixels and even more in your face because of the very thin bezels. This thing feels like it’s all screen in a way that matches the iPad Pro.

This thick boi also features a bigger battery: a full 100Wh, the most allowable under current FAA limits. Apple says this contributes an extra hour of normal operation in its testing regimen compared to the current 15” MacBook Pro. I have not been able to effectively test these claims in the time I’ve had with it so far.

But it is encouraging that Apple has proven willing to make the iPhone 11 Pro and the new MacBook a bit thicker in order to deliver better performance and battery life. Most of these devices are pretty much thin enough. Performance, please.

Speakers and microphone

One other area where the 16” MacBook Pro has made a huge improvement is the speaker and microphone arrays. I’m not sure I ever honestly expected to give a crap about sound coming out of a laptop. “Good enough until I put in a pair of headphones” accurately describes my expectations for laptop sound over the years. Imagine my surprise when I first heard the sound coming out of this new MacBook and it was, no crap, incredibly good.

The new array consists of six speakers arranged so that the subwoofers are positioned in pairs, antipodal to one another (back to back). This has the effect of cancelling out a lot of the vibration that normally contributes to that rattle-prone vibrato that has characterized small laptop speakers pretty much forever.

The speaker setup they have here has crisper highs and deeper bass than you’ve likely ever heard from a portable machine. Movies are really lovely to watch with the built-ins, a sentence I have never once felt comfortable writing about a laptop. 

Apple also vents the speakers through their own chambers, rather than letting sound float out through the keyboard holes. This keeps the sound nice and crisp, with a soundstage that’s wide enough to give the impression of a center channel for voice. One byproduct of this though is that blocking one or another speaker with your hand is definitely more noticeable than before.

The quality of sound here is really very, very good. The HomePod team’s work on sound fields apparently keeps paying dividends. 

That’s not the only audio bit that’s better now, though. Apple has also put in a 3-mic array for sound recording that it claims has a high enough signal-to-noise ratio to rival standalone microphones. I did some testing here comparing it to the iPhone’s mic and it’s absolutely night and day. There is remarkably little hiss present, and artists who use the MacBook as a sketch pad for vocals and other recording are going to get a really nice little surprise here.

I haven’t been able to test it against external mics myself, but I was able to listen to rigs that involved a Blue Yeti and other laptop microphones, and the MacBook’s new mic array was clearly better than any of the other machines and held its own against the Yeti.

The directional nature of many podcast mics is going to keep them well ahead of the MacBook’s internal mic for the most part, but for truly mobile recording setups the MacBook mic just went from completely not an option to a very viable fallback in one swoop. You really have to hear it to get it.

I doubt anyone is going to buy a MacBook Pro for the internal mic, but having a ‘pro level’ device finally come with a pro level mic on board is super choice. 

I think that’s most of it, though I feel like I’m forgetting something…

Oh right, the Keyboard

Ah yes. I don’t really need to belabor the point: the MacBook Pro keyboards just haven’t been up to snuff for some time. Whether you weren’t a fan of the short throw of the new butterfly keyboards or you found yourself one of the many people (yours truly included) who ran up against jammed or unresponsive keys on that design — you know that there has been a problem.

The keyboard situation has been written about extensively by Casey Johnston and Joanna Stern and complained about by every writer on Twitter over the past several years. Apple has offered a succession of updates to that keyboard to attempt to make it more reliable and has extended warranty replacements to appease customers. 

But the only real solution was to ditch the design completely and start over. And that’s what this is: a completely new keyboard.

Apple is calling it the Magic Keyboard in homage to the iMac’s Magic Keyboard (though it is not identically designed). The new keyboard uses a scissor mechanism, not a butterfly one. It has 1mm of key travel (a lot more than the butterfly’s) and an Apple-designed rubber dome under each key that delivers resistance and springback for a satisfying key action. The new keycaps also lock into the mechanism at the top of travel to make them more stable at rest, correcting the wobble of older scissor designs like the MacBook Air’s.

And yes, the keycaps can be removed individually to gain access to the mechanism underneath. And yes, there is an inverted-T arrangement for the arrow keys. And yes, there is a dedicated escape key.

Apple did extensive physiological research when building out this new keyboard. One test measured the effect of a keypress on a human finger. Specifically, they measured the effect of a key on the Pacinian corpuscles at the tips of your fingers. These are onion-like structures in your skin that house nerve endings, and they are most sensitive to mechanical and vibratory pressure.

Apple then created a specialized dome that sends a specific vibration to this receptor, making your finger send a signal to your brain that says ‘hey, you pressed that key.’ This led to a design that gives off the right vibration wavelength to return a satisfying ‘stroke completed’ message to the brain.

There is also more space between the keys, allowing for more definitive strokes, because the keycaps themselves are slightly smaller. The spacing does take some adjustment, but by this point in the article I am already getting pretty proficient and am getting more grief from Catalina’s autocorrect than anything else.

Notably, this keyboard is not in the warranty extension program that Apple is applying to its older keyboard designs; there is a standard one-year warranty on this model. A statement by the company that it believes in the durability of this new design? Perhaps. It has to get out there and get bashed on by more violent keyboard jockeys than I for a while before we can tell whether it’s truly more resilient.

But does this all come together to make a more usable keyboard? In short, yes. The best way to describe it in my opinion is a blend between the easy cushion of the old MacBook Air and the low profile stability of the Magic Keyboard for iMac. It’s truly one of the best feeling keyboards they’ve made in years and perhaps ever in the modern era. I reserve the right to be nostalgic about deep throw mechanical keyboards in this regard, but this is the next best thing. 

Pro, or Pro

In my brief and admittedly limited testing so far, the 16” MacBook Pro ends up looking like it really delivers on the Pro premise of this kind of machine in ways that have been lacking for a while in Apple’s laptop lineup. The increased storage caps, bigger screen, bigger battery and redesigned keyboard should make this an insta-buy for anyone upgrading from a 2015 MacBook Pro and a very tempting upgrade for even people on newer models that have just never been happy with the typing experience. 

Many of Apple’s devices bearing the Pro label lately have fallen into the bucket of ‘the best’ rather than ‘for professionals’. This isn’t strictly a new phenomenon for Apple, but more consumer-centric devices like the AirPods Pro and iPhone 11 Pro are getting the label now more than ever.

But the 16” MacBook Pro is going to alleviate a lot of the pressure Apple has been under to provide an unabashedly Pro product for Pro Pros. It’s a real return to form for the real Mack Daddy of the laptop category. As long as this new keyboard design proves resilient and repairable I think this is going to kick off a solid new era for Apple portables.

Review: The iPhone 11 Pro and iPhone 11 do Disneyland after dark

Let’s get this out of the way right up front: the iPhone 11’s Night Mode is great. It works, it compares extremely well to other low-light cameras, and the exposure and color rendition are best in class, period.

If that does it for you, you can stop right here. If you want to know more about the iPhone 11, augmented photography and how they performed on a trip to the edge of a galaxy far, far away, read on.

As you’re probably now gathering, yes, I took the new iPhones to Disneyland again. If you’ve read my other reviews from the parks, you’ll know that I do this because they’re the ideal real-world test bed for a variety of capabilities. Lots of people vacation with iPhones.

The parks are hot and the network is crushed. Your phone has to act as your ticket, your food ordering tool, your camera and your map. Not to mention your communication device with friends and family. It’s a demanding environment, plain and simple. And, I feel, a better organic test of how these devices fare than sitting them on a desk in an office and running benchmark tools until they go dead.

I typically test iPhones by using one or two main devices and comparing them with the ones they’re replacing. I’m not all that interested in having the Android vs. iPhone debate, because I feel it’s a bit of a straw man: platform lock-in means that fewer and fewer people over time are making a truly agnostic platform choice. They’re deciding based on heredity or services (or price). I know this riles the zealots in both camps, but most people just don’t have the luxury of being religious about these kinds of things.

Given the similarities between the models (more on that later), I mainly used the iPhone 11 Pro for my testing, with tests of the iPhone 11 where appropriate. I used the iPhone XS as a reference device. And despite my lack of desire to do a platform comparison, given how much discussion there has been about Google pulling off a coup with the Pixel 3’s Night Sight mode, I brought along one of those as well this year.

I tried to use the iPhone XS only to compare when comparisons were helpful and to otherwise push the iPhone 11 Pro to handle the full load each day. But, before I could hit the parks, I had to set up my new devices.


Setup

Much of the iPhone 11 Pro’s setup process has remained the same over the years, but Apple has added one new feature worth mentioning: Direct Transfer. This option during setup sits, philosophically, between restoring from a backup made on a local Mac and restoring from an iCloud backup.

Direct Transfer is designed to help users transfer their information directly from one device to another using a direct peer-to-peer connection between the two devices. Specifically, it uses Apple Wireless Direct Link (AWDL), which also powers AirDrop and AirPlay. The transfer is initiated using a particle-cloud pairing screen similar to the one you see when setting up an Apple Watch. Once it’s initiated, your old iPhone and new iPhone will be out of commission for up to 2-3 hours, depending on how much information you’re transferring.

The data is encrypted in transit. Information directly transferred includes Messages history, full-resolution photos that are already stored on your phone and any app data attached to installed apps. The apps themselves are not transferred, because Apple’s app signing procedure locks apps to a device; they must be (automatically) re-downloaded from the App Store, a process that begins once the Direct Transfer is complete. This also ensures that you’re getting the appropriate version of the app.

Once you’ve done the transfer, the data on your phone is then “rationalized” with iCloud. This helps in cases where you have multiple devices and one of those other devices could have been making changes in the cloud that now need to be updated on the device.
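
To make that concrete, here is a minimal sketch of what a reconciliation pass like this could look like. It is purely illustrative: the record type and the merge policy are my own assumptions, since Apple hasn’t published how Direct Transfer rationalizes data with iCloud.

```swift
import Foundation

// Hypothetical record type; Apple's real data model is not public.
struct Record {
    let id: String          // stable identifier shared by device and cloud
    let modifiedAt: Date    // last-write timestamp
    let payload: Data
}

// Union of locally transferred and cloud records; when both sides hold the
// same record, the most recently modified copy wins.
func rationalize(local: [Record], cloud: [Record]) -> [Record] {
    var merged = Dictionary(local.map { ($0.id, $0) },
                            uniquingKeysWith: { first, _ in first })
    for record in cloud {
        if let existing = merged[record.id] {
            if record.modifiedAt > existing.modifiedAt {
                merged[record.id] = record  // cloud copy is newer
            }
        } else {
            merged[record.id] = record      // cloud-only record
        }
    }
    return Array(merged.values)
}
```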

Apple noted that Direct Transfer is good for a few kinds of people:

  • People without an iCloud backup
  • People who have not backed up in a while
  • People in countries where internet speeds are not broadly strong, like China
  • People who don’t mind waiting longer initially for a ‘more complete’ restore

Basically what you’ve got here is a choice between having your iPhone ‘ready’ immediately for basic functionality (iCloud backup restore) and waiting a bit longer to have far more of your personal data accessible from the start, without waiting for iCloud downloads of original photos, Messages history etc.

Direct Transfer also does not transfer Face ID or Touch ID settings, Apple Pay information or Mail Data aside from usernames and passwords.

After the migration is complete, the Messages content on the device will be reconciled with the Messages content in iCloud to ensure they are in sync. The same is true for photos stored in iCloud.

Anecdotally, I experienced a couple of interesting things during my use of Direct Transfer. My first phone took around 2.5 hours to complete, but I still found that Messages alerted me that it needed to continue downloading archived messages in the background. Apple suggested that this may be due to the rationalizing process.

I also noticed that when simultaneous Direct Transfer operations were active, side-by-side devices took much longer to complete. This is very likely due to local radio interference, and Apple has a solution to it: a wired version of the Direct Transfer option that uses the Camera Connection Kit to connect the two devices via USB. Ideally, Apple says, the transfer speeds are identical, but the wired option side-steps the wireless interference problem entirely — which is why Apple will be using it for in-store device restores of new iPhones set up with Direct Transfer.

My experience with Direct Transfer wasn’t glitch-free, but it was nice having what felt like a ‘more complete’ device from the get-go. Of note, Direct Transfer does not appear to transfer all keychain data intact, so you will have to re-enter some passwords.


Design and Display

I’ve been naked for years. That is, team no case. Cases are annoying to me because of the added bulk. They’re also usually too slippery or too sticky. I often wear technical clothing, too, and the phones go into slots designed for quick in-and-out, or for fun party tricks like dropping into your hand with the pull of a clasp. This becomes impossible with most cases.

Apple provided the clear cases for all three iPhones, and I used them to keep the phones looking decent while I reviewed them, but I guarantee you my personal phone will never see a case.

I’m happy to report that the iPhone 11 Pro’s matte-finish back increases the grippiness of the phone on its own. The smooth backs of the iPhone 11 and the iPhone XS always required a bit of finger oil to get to a point where you could reliably pivot them with one hand going in and out of a pocket.

Traveling through the parks you get sweaty (in the summer), greasy with that Plaza fried chicken and turkey legs and all kinds of kid-related spills. Having the confidence of a case while you’re in these fraught conditions is something I can totally understand. But day-to-day it’s not my favorite.

I do like the unified design identity across the line: the bump surface is blasted glass on the glossy-backed iPhone 11, and those finishes are flipped on the iPhone 11 Pro. It provides a design language link even though the color schemes are different.


At this point, either you’ve bought into the camera bump as functional and cool or you hate its guts. Adding another camera is not going to do much to change either camp’s opinion. The iPhone 11 Pro and Pro Max have a distinctly Splinter Cell vibe about them now. I’m sure you’ve seen the jokes about iPhones getting more and more cameras; well, yeah, that’s not a joke.

I think that Apple’s implementation feels about the best it could be here. The iPhone 11 Pro is already thicker than the previous generation, but there’s no way it’s going to get thick enough to swallow a bump this high. I know you might think you want that too, but you don’t.

Apple gave most reviewers the Midnight Green iPhone 11 Pro/Max and the minty green iPhone 11. If I’m being honest, I prefer the mint. Lighter and brighter is just my style. In a perfect world, I’d be able to rock a lavender iPhone 11 Pro. Alas, this is not the way Apple went.


The story behind the Midnight Green, as it was explained to me, begins with Apple’s colorists calling it a color set to break out over the next year. The fashion industry concurs, to a degree: mint, seafoam and ‘neon’ greens, which were hot early in the year, have given way to sage, crocodile and moss. It’s also a dark, muted color that Apple says is ideal for giving off that Pro vibe.

The green looks nearly nothing like any of the photographs I’ve seen of it on Apple’s site.

In person, the Midnight Green reads as dark gray in anything but the most direct indoor light. Outdoors, the treated stainless band has an ‘80s Mall Green’ hue that I actually really like. The back also opens up quite a bit, presenting as far more forest green than it does indoors. Overall, though, this is a very muted color that is pretty buttoned up. It sits comfortably alongside neutral-to-staid colors like the Space Gray, Silver and Gold.

The Silver option is likely to be my personal pick this time around, just because the frosted white back looks so hot. It would be the first time in a while that I haven’t gone gray or black.


Apple’s new Super Retina XDR display has a 2,000,000:1 contrast ratio and reaches up to 1,200 nits with HDR content and 800 nits otherwise. What does this mean out in the sun at the park? Not a whole lot, but the screen is slightly easier to read and see detail on in sunny conditions. The “extended” portion of Apple’s new XDR terminology on the iPhone 11 Pro refers to luminance, not color, so the color gamut remains the same. However, I have noticed that HDR images look slightly flatter on the iPhone XS than they do on the iPhone 11 Pro. The iPhone 11’s screen, while decent, does not compare to the rich blacks and great contrast range of the iPhone 11 Pro’s. It’s one of two major reasons to upgrade.

Apple’s proprietary 3D Touch system has gone the way of the dodo with the iPhone 11. The reasoning: Apple realized it would never be able to ship the system economically or viably on the iPad models, so it was canned in favor of Haptic Touch, bringing more consistency across the lineup.


By and large it works fine. It’s a little odd for 3D Touch users at first. You retain peek and quick actions but lose pop, for instance, because there’s no additional pressure threshold. Most of the actions you probably commonly used 3D Touch for, like the camera and flashlight shortcuts or home screen app shortcuts, work just fine.

I was bullish on 3D touch because I felt there was an opportunity to add an additional layer of context for power users — a hover layer for touch. Unfortunately I believe that there were people at Apple (and outside of it) that were never convinced that the feature was going to be discoverable or useful enough so it never got the investment that it needed to succeed. Or, and I will concede this is a strong possibility, they were right and I was wrong and this just was never going to work.


Performance and Battery

Apple’s A13 Bionic features efficiency cores that are 20% faster and use 40% less power than the A12 Bionic’s — part of where some impressive battery life improvements come from. Overall clock speed and benchmarks are up around 20%. The performance cores also use 30% less power, and the GPU uses 40% less power. The Neural Engine doesn’t escape either, using 15% less power. All compared to the iPhone XS.

My focus on the cores’ power usage is not to say this phone feels any less juicy; all new iPhones feel great out of the box, because Apple (usually) works to neatly match power requirements with its hardware, and any previous-generation software is going to have plenty of overhead on day one. No change here.

The biggest direct effect that this silicon work will have on most people’s lives will likely be battery life.

The iPhone 11 Pro has a larger battery than the iPhone XS, with a different, higher-voltage chemistry. That, coupled with the power savings mentioned above and more savings in the screen and other components, means better battery life.

My battery tests over several days at the parks point to Apple’s claims about improvements over the iPhone XS being nearly dead on. Apple claims that the iPhone 11 Pro lasts 4 hours longer than the iPhone XS. The iPhone XS came in at roughly 9.5 hours in tests last year, and the iPhone 11 Pro came in nearly bang on at 12 hours — in extremely challenging conditions.

It was hot, the networks were congested and I was testing every feature of the camera and phone I could get my hands on. Disneyland has some WiFi in areas of the park, but the coverage is not total, so I relied on LTE for the majority of my usage. This included on-device processing of photos and video (of which I shot around 40 minutes or so each day). It also included using Disney’s frustrating park app, about which I could write a lot of complaints.

I ordered food, browsed Twitter while in lines, let the kids watch videos while the wife and I had a necessary glass of wine or six, and messaged continuously with family and team members. The battery lasted significantly longer on the iPhone 11 Pro under intense usage than on the iPhone XS, which juuuust edged out my iPhone X in last year’s tests.

One of the reasons that I clone my current device and run it that way instead of creating some sort of artificially empty test device is that I believe that is the way that most people will be experiencing the phone. Only weirdos like device testers and Marie Kondo acolytes are likely to completely wipe their devices and start fresh on purchase of a new iPhone.

I’m pretty confident you’ll see an improvement in the battery as well. I’ve done this a lot, and these kinds of real-world tests at theme parks tend to put more of the strains you’ll see in daily life on the phone than a bench test running an artificial web-browsing routine does. On the other hand, maybe you’re a worker at a bot farm and I’ve just touched a nerve. If so, I am sorry.


Also, an 18W power adapter, the same one that ships with the iPad Pro, comes in the iPhone 11 Pro box. Finally, etc. It is very nice having at least one USB-C end on the majority of my cables now, because I can use multi-port GaN chargers from Anker and power bricks that have USB-C. Apple’s USB-C to Lightning cables are a slightly thicker gauge now and support data transfer as well as the 18W brick. The bigger charger means faster charging: Apple claims up to a 50% charge in 30 minutes, which feels about like what I experienced.

It’s quicker, and much nicer to top off while nursing a drink and a meatball at the relatively secret upstairs bar at the Wine Country Trattoria in DCA. There’s an outlet behind the counter; just ask to use it.

Unfortunately, the iPhone 11 (non-Pro) still comes with a 5W charger. This stinks. I’d love to see the 18W become standard across the line.

Oh, about that improved Face ID angle — I saw, maybe, a sliiiiiiight improvement, if any. But not that much. A few degrees? Sometimes? Hard to say. I will be interested to see what other reviewers found. Maybe my face sucks.


Camera and Photography

Once upon a time you could relatively easily chart the path of a photograph’s creation. Light passed through the lens of your camera onto a medium like film or chemically treated paper. A development process was applied, a print was made and you had a photograph.

When the iPhone 8 was released, I made a lot of noise about how it was the first of a new wave of augmented photography. That journey continues with the iPhone 11. The ISP normally takes on the computational tasks associated with color correction and making images look presentable from the raw material the sensor produces. Now Apple has added the Neural Engine’s machine learning expertise to that pipeline, and it’s doing a bunch of things in various modes.

This is what makes the camera augmented on the iPhone 11, and what delivers the most impressive gains of this generation, not new glass, not the new sensors — a processor specially made to perform machine learning tasks.

What we’re seeing in the iPhone 11 is a blended apparatus that happens to include three imaging sensors, three lenses, a scattering of motion sensors, an ISP, a machine learning-tuned chip and a CPU, all working in concert to produce one image. This is a machine learning camera. But as far as the software that runs the iPhone is concerned, it has one camera. In fact, it’s not really a camera at all; it’s a collection of devices and bits of software that work together toward a singular goal: producing an image.

This way of thinking about imaging affects a bunch of features from night mode to HDR and beyond, and the result is the best camera I’ve ever used on a phone.

But first, let’s talk new lenses.

Ultra Wide

Both the iPhone 11 and the iPhone 11 Pro get a new “ultra wide angle” lens that Apple is calling a 13mm. In practice it delivers about what you’d expect from a roughly 13mm lens on a full-frame SLR — very wide. Even with edge correction it has the natural and expected effect of elongating subjects up close and producing some dynamic images. At a distance, it provides options for vistas and panoramic images that weren’t available before. Up close, it does wonders for group shots and family photos, especially in tight quarters where you’re backed up against a wall.


In my testing, the ultra wide showed off extremely well, especially in bright conditions. It allowed for great close-up family shots, wide-angle portraits that emphasized dynamism and vistas that really opened up shooting possibilities that haven’t existed on iPhone before.


One clever detail here: when you shoot at 1x or 2x, Apple blends the live view from the wider-angle lenses directly into the viewfinder. It doesn’t just show you the wide view with crop marks over it; it pipes in the actual feed from the sensor so that you get a precise idea of how the image might look, while still letting you see the options outside of the frame. It’s the camera-viewfinder-engineer version of stunting.


I loved shooting people with it up close, but that won’t be for everyone. I’d guess most people will like it for groups and for landscapes. But I found it great to grab fun tight shots of people or really intimate moments that feel so much more personal when you’re in close.

Of note, the ultra wide lens does not have optical image stabilization on either the iPhone 11 or iPhone 11 Pro. This makes it a much trickier proposition to use in low light or at night.

The ultra wide camera cannot be used with Night Mode, because its sensor does not have 100% focus pixels and, of course, it has no OIS. As a result, wide-angle night shots must be held very steady or they will come out soft.

The ultra wide coming to both phones is great. It’s a wonderful addition, and I think people will get a ton of use out of it on the iPhone 11. If Apple had to add just one lens, the ultra wide was the better option, because group shots of people are likely far more common than landscape photography.

The ultra wide is also fantastic for video. Because of the natural inward crop of video (it uses less of the sensor, so it feels more cramped), the standard wide lens has always felt a little claustrophobic. Taking videos on the carousel riding along with Mary Poppins, for instance, I was unable to get her and Bert in frame at once with the iPhone XS, but was able to with the iPhone 11 Pro. Riding the Matterhorn, you get much more of the experience and less ‘person’s head in front of you.’ Same goes for Cars, where the ride is so dominated by the windscreen. I know these are very specific examples, but you can imagine how similar scenarios could play out at family gatherings in small yards, indoors or in other cramped locations.


One additional tidbit about the ultra wide: you may very well have to find a new grip for your phone. The lens is so wide that your finger may show up in some of your shots because your knuckle is in frame. It happened to me a bunch over the course of a few days until I found a grip lower on the phone. iPhone 11 Pro Max users will probably not have to worry.

HDR and Portrait Improvements

Because of those changes to the image pathway I talked about earlier, the already good HDR images get a solid improvement in portrait mode. The Neural Engine works on all HDR images coming out of the iPhone’s cameras, tone mapping and fusing image data from the various physical sensors into a single photo. It could use pixels from one camera for highlight detail and pixels from another for the edges of a frame. I went over this system extensively back in 2016, and it’s only gotten more sophisticated since with the addition of the Neural Engine.

It seems to be getting another big leap forward when Deep Fusion launches, but I was unable to test that yet.

For now, we can see the additional work the Neural Engine puts in with Semantic Rendering. This process involves your iPhone doing facial detection on the subject of a portrait, isolating the face and skin from the rest of the scene and applying a different path of HDR processing to it than to the rest of the image. The rest of the image gets its own HDR treatment, and then the two are fused back together.

This is not unheard of in image processing. Most photographers worth their salt will give faces a different pass of adjustments from the rest of an image, masking off the face so that it doesn’t turn out too flat or too contrasty or come out with the wrong skin tones.

The difference here, of course, is that it happens automatically, on every portrait, in fractions of a second.
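
To make that pipeline concrete, here is a minimal sketch of the masked, two-pass idea, under some loud assumptions: images are reduced to flat arrays of luminance values, the tone curves are arbitrary gammas I picked for illustration and face detection itself is out of scope. This is the shape of the technique, not Apple’s implementation.

```swift
import Foundation

// Gentler curve for skin: lifts shadows less aggressively, preserving the
// natural contrast of a face.
func faceToneCurve(_ v: Float) -> Float { pow(v, 0.9) }

// Stronger curve for the scene: opens shadows and normalizes contrast.
func sceneToneCurve(_ v: Float) -> Float { pow(v, 0.7) }

// mask[i] is 1.0 where face detection found skin and 0.0 elsewhere.
// (Real masks would be soft-edged rather than binary.)
func semanticRender(pixels: [Float], mask: [Float]) -> [Float] {
    zip(pixels, mask).map { value, m in
        let face = faceToneCurve(value)
        let scene = sceneToneCurve(value)
        return m * face + (1 - m) * scene  // fuse the two passes per pixel
    }
}
```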

The results are portraits that look even better on iPhone 11 and iPhone 11 Pro. Faces don’t have the artificially flat look they could sometimes get with the iPhone XS — a result of the HDR process that is used to open up shadows and normalize the contrast of an image.

[Side-by-side portrait comparison]

Look at these two portraits, shot at the same time in the same conditions. The iPhone 11 Pro is far more successful at identifying backlight and correcting for it across the face and head. The result is better contrast and color, hands down. And this was not an isolated experience; I shot many portraits side by side, and the iPhone 11 Pro was the pick every time, with especially wide margins if the subject was backlit, which is very common in portraiture.

[A second side-by-side portrait comparison]

Here’s another pair. The differences are more subtle, but look at the color balance between the two: the skin tones are warmer, more olive and (you’ll have to trust me on this one) truer to life on the iPhone 11 Pro.


And yes, the High-Key Light Mono effect works, but it’s still not perfect.

Night Mode

Now for the big one: the iPhone 11 finally has a Night Mode. Though I wouldn’t really call it a mode, because it doesn’t actually require that you enable it; it just kicks in automatically when it thinks it can help.

On a technical level, Night Mode is a function of the camera system that strongly resembles HDR. It does several things when it senses that the light levels have fallen below a certain threshold.

  1. It decides on a variable number of frames to capture based on the light level, the steadiness of the camera according to the accelerometer and other signals.
  2. The ISP then grabs these bracketed shots, some with longer and some with shorter exposures.
  3. The Neural Engine isn’t central to Night Mode itself, but it’s still involved, because it handles semantic rendering across all HDR imaging on iPhone 11.
  4. The ISP then works to fuse those shots based on foreground and background exposure and whatever masking the Neural Engine delivers (a simplified sketch of this bracket-and-fuse idea follows below).
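
Here is that simplified sketch. To be clear, this is not Apple’s pipeline: the frame counts, the weighting scheme and the steadiness heuristic are invented for illustration, and frames are reduced to flat arrays of luminance values.

```swift
import Foundation

struct Frame {
    let exposureSeconds: Double
    let pixels: [Float]  // flattened luminance values, 0...1
}

// Step 1: choose a bracket size from light level and camera steadiness.
// The thresholds here are made up; Apple's actual signals are not public.
func bracketCount(lightLevel: Double, isSteady: Bool) -> Int {
    let base = lightLevel < 0.1 ? 9 : 5
    return isSteady ? base : base / 2  // fewer frames when handheld
}

// Steps 2 and 4: fuse the bracketed shots. Longer exposures are trusted
// more in dark regions, shorter ones in bright regions, to avoid clipping.
func fuse(_ frames: [Frame]) -> [Float] {
    guard let first = frames.first else { return [] }
    var out = [Float](repeating: 0, count: first.pixels.count)
    for i in out.indices {
        var weightedSum: Float = 0
        var totalWeight: Float = 0
        for frame in frames {
            let v = frame.pixels[i]
            let w = Float(frame.exposureSeconds) * (1 - v) + 0.1
            weightedSum += v * w
            totalWeight += w
        }
        out[i] = weightedSum / totalWeight
    }
    return out
}
```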

The result is a shot that brightens dark-to-very-dark scenes well enough to turn them from throwaway images into something well worth keeping. In my experience, it was actually difficult to find scenes dark enough to make the effect really intense. The new 33% improvement in ISO on the wide camera and 42% improvement on the telephoto over the iPhone XS already help a lot.


But once you do find the right scene, you see detail and shadow pop, and it becomes immediately evident even before you press the shutter that it is making the image dramatically brighter. Night Mode works only in the 1x and 2x shooting modes, because only those cameras have the 100% focus pixels needed to do the detection and mapping that the iPhone 11 needs to make the effect viable.


I have this weird litmus test I put every new phone camera through where I take it on a dark ride, like Winnie the Pooh, to see if I can get any truly sharp usable image. It’s a great test because the black light is usually on, the car is moving and the subject is moving. Up until this point I have succeeded exactly zero times. But the iPhone 11 Pro pulled it off. Not perfect, but pretty incredible all things considered.

A few observations about Night Mode:

  • The night images still feel like night time. This is the direct result of Apple making a decision not to open every shadow and brighten every corner of an image, flaring saturation and flattening contrast.
  • The images feel like they have the same genetic makeup as an identical photo taken without night mode. They’re just clearer and the subject is brighter.
  • Because of the semantic mapping working on the image, along with other subject detection work, the focal point of the image should be clearer and brighter, but the setting and scene do not all come up at once like a broad gain adjustment.
  • iPhone 11, like many other ‘night modes’ across phones, has issues with moving subjects. It’s best if no one is moving or they are moving only very slightly. This can vary depending on the length of exposure from 1-3 seconds.
  • On a tripod or another stationary object, Night Mode will automatically extend up to a 10 second exposure. This allows for some great night photography effects like light painting or trailing.

The result is naturally bright images that retain a fantastic level of detail while still feeling like they have natural color that is connected to the original subject matter.

Back when the Pixel 3 shipped Night Sight, I noted that choosing a gain-based night mode had consequences, and that Apple likely could ship something based on pure gain but had consistently made choices to do otherwise, and would likely continue to for whatever it shipped. People really hated this idea, but it holds up exactly.

Though the Galaxy S10+ has a great night mode as well, the Pixel 3 was the pioneer here and still jumps to mind when judging night shots. The choices Google has made are much more in the realm of ‘everything brighter.’ If you love it, you love it, and that’s fine. But it is absolutely not approaching this from a place of restraint.

[Photo: an Ithorian at night]

Here are some examples of the iPhone 11 Pro up against images from the Pixel 3. As you can see, both do solid work brightening the image, but the Pixel 3 is colder, flatter and more evenly brightened. The colors are not representative at all.

[Comparison: the Millennium Falcon, iPhone vs. Pixel]

In addition, whatever juice Google is using to get these images out of a single camera and sensor, it suffers enormously on the detail level. You can see the differences here in the rock work and towers. It’s definitely better than having a dark image, but it’s clear that the iPhone 11 Pro is a jump forward.

The Pixel 4 is around the corner, of course, and I can’t wait to see what improvements Google comes up with. We are truly in a golden age for taking pictures of dark shit with phone cameras.

Of note, the flash is now 36% brighter than the iPhone XS, which is a nice fallback for moving subjects.


Tidbits

Auto crop

The iPhone 11 will, by default, auto-crop subjects back into videos shot at 1x or 2x. If you’re chasing your kid and his head goes out of frame, you could see an Auto button on the one-up review screen after a bit of processing. Tapping it will re-frame your video automatically. Currently this only works with the QuickTake function directly from the iPhone’s lock screen. It can be toggled off.

You can toggle on auto-cropping for photos in the Camera settings menu if you wish; it is off by default. It has a very similar effect, using image analysis to see whether there is image data it can use to re-center your subject.
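
As a rough illustration of the geometry involved (and nothing more; Apple hasn’t detailed the pipeline), re-centering boils down to sliding a fixed-size crop window toward a detected subject while keeping it inside the captured frame:

```swift
import CoreGraphics

// Given the full captured frame, a detected subject bounding box and the
// desired output size, compute a crop centered on the subject, clamped so
// it never leaves the frame. All names here are illustrative.
func recenteredCrop(frame: CGRect, subject: CGRect, cropSize: CGSize) -> CGRect {
    var origin = CGPoint(x: subject.midX - cropSize.width / 2,
                         y: subject.midY - cropSize.height / 2)
    origin.x = max(frame.minX, min(origin.x, frame.maxX - cropSize.width))
    origin.y = max(frame.minY, min(origin.y, frame.maxY - cropSize.height))
    return CGRect(origin: origin, size: cropSize)
}
```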

Slofies

Yeah, they’re fun, yeah, they work. They’re going to be popular for folks with long hair.

U1

Apple has included a U1 chip in the iPhone 11 – I can’t test it yet, but it’s interesting as hell. It’s probably best to reserve extensive discussion for a bit, as Apple will ship the U1’s first iPhone functionality with a directional…AirDrop feature? This is definitely one of those things where the future purposes, a Tile-like locator perhaps, were delayed for some reason, and a side project of the AirDrop team got elevated to first ship. Interestingly, Apple mentioned, purely as an example, that this could be used to start car ignitions given the appropriate manufacturer support.

If this sounds familiar, then you’ve probably read anything I’ve written over the last several years. It’s inevitable that iPhones and Apple Watches will begin to take on functionality like this; it’s just a matter of doing it precisely and safely. The U1 is all about location on a micro level: not broad, network-based or GPS location, but precise location and orientation. That opens up a bunch of interesting possibilities.

[Photo comparison: the same scene without Night Mode and with Night Mode]

About that Pro

And then there was the name: iPhone 11 Pro. When I worked at a camera shop, I learned the power of the word “pro.” For some people it was an aphrodisiac; for others, a turn-off. And for others still, it was simply a necessity.

Is this the pro model? Oh I’m not a pro. Oooh, this is the pro!

We used it as a sales tool, for sure. But every so often it was also necessary to use it to help prevent people from over-buying or under-buying for their needs.

In the film days, one of the worst things you could ever shoot as a pro-am photographer was gym sports. It was fast action, indoors where it’s comparatively dim, and at a distance from courtside. There was no cheap way to do it. No cranking the ISO to 64,000 and letting your camera’s computer clean it up. You had to get expensive glass, an expensive camera body to operate that glass and an expensive support like a monopod. You also had to not be a dumbass (this was the most expensive part).

Amateurs always balked at the barrier to entry in these kinds of scenarios. But the real pros knew that for every extra dollar they spent on the good stuff, they’d make it back tenfold in profits, because they could deliver a product no parent with a point-and-shoot could hope to replicate.

However, the vast majority of people that walked into the shop weren’t shooting hockey or wrestling. They were taking family photos, outdoor pics and a few sunsets.

Which brings us to what the term Pro means now: Pro is about edge cases.

It’s not about the 80% case, it’s about the 20% of people that need or want something more out of their equipment.

For this reason, the iPhone 11 is going to sell really well. And it should, because it’s great. It has the best new lens, an ultra wide that takes great family photos and landscape shots. It has nearly every software feature of the iPhone 11 Pro. But it doesn’t have the best screen, and it doesn’t have telephoto. For people who want to address the edge cases – the best video and photo options, a better dark-mode experience, a brighter screen — the iPhone 11 Pro is there. For everyone else, there’s still fiscal 2020’s best-selling iPhone.

[Photo: Yep, Night Mode]

I hope Apple Arcade makes room for weird cool shit

Apple Arcade seems purpose-built to make room in the market for games that are beautiful, sad, weird, moving, slow, clever and heartfelt: all things that the action-, shooter- and MOBA-driven major market of games has done nothing to foster over the last decade.

I had a chance to play a bunch of the titles coming to Apple Arcade, which launched today in a surprise move for some early testers of iOS 13. Nearly every game I played was fun, all were gorgeous and some were really really great.

A few I really enjoyed, in no particular order:


Where Cards Fall — A Snowman game from Sam Rosenthal. A beautiful game with a clever card-based mechanic that allows room for story moments, and a ramping difficulty level that should be fantastic for short play sessions. Shades of Monument Valley, of course, in its puzzle-plus-story interleave and in its willingness to get super emotional about things right away. More of this in gaming! Super satisfying gameplay and crisp animations abound.


Overland — Finji — Overland is one of my most anticipated games of the bunch; I’ve been following its development by the Night in the Woods and Canabalt creators for a long time. It does not disappoint, with a stylized but somehow hyper-realized post-apocalyptic turn-based system that transmits urgency through economy of movement. Every act you take counts. Given that it’s a roguelike, the story is told through the world rather than through an individual character’s narrative, and the world does a great job of it.


Oceanhorn 2 — Cornfox & Brothers — The closest to a native Zelda you’ll get on iOS — this plays great on a controller. Do yourself a favor and try it that way.


Spek — RAC7 — One of those puzzle games people will plow through. It makes the mechanics simple to understand, then begins to really push and prod at your mastery of them over time. The AR component seems like it will be a better party game than solo experience, but the effects used here are great, and it really plays with distance and perspective in the way an AR game should. A good totem for the genre going forward.

I was able to play several of the games across all three platforms including Apple TV with an Xbox controller, iPhone and iPad. While some favored controller (Skate City) and others touch controls (Super Impossible Road), all felt like I could play them either way without much difficulty.

There are also some surprises in the initial batch of games, like Lego Brawls — a Smash Brothers clone that I think will be a big hit for car rides and get-togethers.

My hope is that the Apple Arcade advantage, an aggressive $4.99 price and prime placement in the App Store, may help to create an umbrella of sorts for games that don’t fit the ‘big opening weekend’ revenue mold, and I hope Apple leans into that. I know that there may be action-oriented and big-name titles in the package now and in the future, and that’s fine. But there are many kinds of games out there that are fantastic but “minor” in the grand scheme of things, and having a place that could create sustainability in the market for these gems is a great thing.

Apple did not disclose the financial terms, but many of the developers appear to have gotten up-front money to make games for the platform and, doubtless, there is a revenue share on some sort of basis, probably usage or installs. Whatever it is, I hope the focus is on sustainability, and the people responsible for Arcade inside Apple are making all the right noises about that, so I have hopes.

I am especially glad that Apple is being aggressive with the pricing and with the restrictions it has set for the store, including no in-app purchases or ads. This creates an environment where a parent (ratings permitting) can be confident that a kid playing games from the Arcade tab will not be besieged with casino ads in the middle of their puzzle game.

There is, however, a general irony in the fact that Apple had to create Apple Arcade because of the proliferation of loot box, currency and in-app purchase revenue models: an economy driven by the App Store’s overall depressive effect on the price of games and the decade-long acclimation of people to spending less and less, down to free, for games and apps on the store.

By bundling them into a subscription, Apple sidesteps the individual purchase barrier that it has had a big hand in creating in the first place. While I don’t think it is fully to blame — plenty of other platforms aggressively promote loot box mechanics — a big chunk of the responsibility to fix this distortion does rest on Apple. Apple Arcade is a great stab at that and I hope that the early titles are an indicator of the overall variety and quality that we can expect.

Apple tweaks App Store rule changes for children’s apps and sign in services

Originally announced in June, the changes to Apple’s App Store policies covering its Sign in with Apple service and the rules around children’s app categories are being tweaked. New apps must comply with the updated terms right away, but existing apps will have until early 2020 to do so.

The changes announced at Apple’s developer conference in the summer were significant, and raised concerns among developers that the rules could handicap their ability to do business in a universe that, frankly, offers tough alternatives to ad-based revenue for children’s apps.

In a short interview with TechCrunch, Apple’s Phil Schiller said that they had spent time with developers, analytics companies and advertising services to hear what they had to say about the proposals and have made some updates.

The changes are garnering some strong statements of support from advocacy groups and advertising providers for children’s apps that were pre-briefed on the tweaks, which show up as of this morning in Apple’s developer guidelines.

“As we got closer to implementation we spent more time with developers, analytics companies and advertising companies,” said Schiller. “Some of them are really forward thinking and have good ideas and are trying to be leaders in this space too.”

With their feedback, Schiller said, they’ve updated the guidelines to allow them to be more applicable to a broader number of scenarios. The goal, he said, was to make the guidelines easy enough for developers to adopt while being supportive of sensible policies that parents could buy into. These additional guidelines, especially around the Kids app category, says Schiller, outline scenarios that may not be addressed by the Children’s Online Privacy Protection Act (COPPA) or GDPR regulations.

There are two main updates.

Kids changes

The first area that is getting further tweaking is the Kids terms. Rule sections 1.3 and 5.1.4 specifically are being adjusted after Apple spoke with developers and providers of ad and analytics services about their concerns over the past few months.

Both of those rules are being updated to add more nuance to their language around third-party services like ads and analytics. In June, Apple announced a very hard-line version of these rules that essentially outlawed any third-party ads or analytics software and prohibited any data transmission to third parties. The new rules offer developers some opportunities to continue integrating these into their apps, but also set out explicit constraints for doing so.

The big changes come in section 1.3 surrounding data safety in the Kids category. Apple has removed the explicit restriction on including any third-party advertising or analytics. This was the huge hammer that developers saw heading towards their business models.

Instead, Apple has laid out a much more nuanced proposal for app developers. Specifically, it says these apps “should not” include analytics or ads from third parties, implicitly acknowledging that there are ways to provide these services while also practicing data safety on the App Store.

Apple says that in limited cases, third-party analytics may be permitted as long as apps in the Kids category do not send personal identifiable information or any device fingerprinting information to third parties. This includes transmitting the IDFA (the device ID for advertisers), name, date of birth, email address, location or any other personally identifiable information.

Third-party contextual ads may be allowed, but only if the companies providing the ads have publicly documented practices and policies and also offer human review of ad creatives. That certainly limits the options, ruling out most offerings from programmatic services.

Rule 5.1.4 centers on data handling in kids apps. In addition to complying with COPPA, GDPR and other local regulations, Apple sets out some explicit guard rails.

First, the language on third-party ads and analytics has been changed from “may not” to “should not.” Apple is discouraging their use, but acknowledges that “in limited cases” third-party analytics and advertising may be permitted, provided they adhere to the new rules set out in guideline 1.3.

The explicit prohibition on transmitting any data to third parties from apps in the Kids category has been removed. Once again, this was the big bad bullet that every children’s app maker was paying attention to.

An additional clause reminds developers not to use terms like “for kids” and “for children” in app metadata for apps outside of the Kids category on the App Store.

SuperAwesome is a company that provides services like safe ad serving to kids apps. CEO Dylan Collins was initially critical of Apple’s proposed changes, noting that killing off all third-party ads could decimate the kids app category.

“Apple are clearly very serious about setting the standard for kids apps and digital services,” Collins said in a statement to TechCrunch after reviewing the new rules Apple is publishing. “They’ve spent a lot of time working with developers and kidtech providers to ensure that policies and tools are set to create great kids digital experiences while also ensuring their digital privacy and safety. This is the model for all other technology platforms to follow.”

All new apps must adhere to the guidelines. Existing apps have been given an additional six months to live in their current form but must comply by March 3, 2020.

“We commend Apple for taking real steps to protect children’s privacy and ensure that kids will not be targets for data-driven, personalized marketing,” said Josh Golin, Executive Director of Campaign for Commercial-Free Childhood. “Apple rightly recognizes that a child’s personal identifiable information should never be shared with marketers or other third parties. We also appreciate that Apple made these changes on its own accord, without being dragged to the table by regulators.”

The CCFC had a major win recently when the FTC announced a $170M fine against YouTube for violations of COPPA.

Sign in with Apple

The second set of updates has to do with Apple’s Sign in with Apple service.

Sign in with Apple is a sign-in service that can be offered by an app developer to instantly create an account that is handled by Apple with additional privacy for the user. We’ve gone over the offering extensively here, but there are some clarifications and policy additions in the new guidelines.

Apple will require Sign in with Apple to be offered if your app exclusively offers third-party or social logins like those from Twitter, Google, LinkedIn, Amazon or Facebook. It is not required if users sign in with a unique account created in the app, with, say, an email and password.

But clarifications have been added for some additional scenarios. Sign in with Apple will not be required in the following conditions:

  • Your app exclusively uses your company’s own account setup and sign-in systems.
  • Your app is an education, enterprise or business app that requires the user to sign in with an existing education or enterprise account.
  • Your app uses a government or industry-backed citizen identification system or electronic ID to authenticate users.
  • Your app is a client for a specific third-party service and users are required to sign in to their mail, social media or other third-party account directly to access their content.

Most of these were sort of assumed to be true but were not initially clear in June. The last one, especially, was one that I was interested in seeing play out. This scenario applies to, for instance, the Gmail app for iOS, as well as apps like Tweetbot, which log in via Twitter because all they do is display Twitter.

Starting today, new apps submitted to the store that don’t meet any of the above requirements must offer Sign in with Apple to users. Current apps and app updates have until April 2020 to comply.
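For developers who now fall under the requirement, the client-side flow lives in Apple’s AuthenticationServices framework on iOS 13. Here’s a minimal sketch; the view controller wiring is simplified, and a production app would also set a presentation context provider and validate the credential server-side.

```swift
import UIKit
import AuthenticationServices

// A minimal sketch of the Sign in with Apple flow (iOS 13+). Error
// handling and server-side token validation are omitted for brevity.
class LoginViewController: UIViewController, ASAuthorizationControllerDelegate {

    @objc func handleSignInWithApple() {
        let request = ASAuthorizationAppleIDProvider().createRequest()
        request.requestedScopes = [.fullName, .email] // only provided on first sign-in
        let controller = ASAuthorizationController(authorizationRequests: [request])
        controller.delegate = self
        controller.performRequests()
    }

    func authorizationController(controller: ASAuthorizationController,
                                 didCompleteWithAuthorization authorization: ASAuthorization) {
        guard let credential = authorization.credential as? ASAuthorizationAppleIDCredential else { return }
        // credential.user is a stable, team-scoped identifier; Apple may
        // relay a private email address rather than the user's real one.
        print("Signed in as \(credential.user)")
    }

    func authorizationController(controller: ASAuthorizationController,
                                 didCompleteWithError error: Error) {
        print("Sign in with Apple failed: \(error)")
    }
}
```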

Both of these tweaks come after developers and other app makers expressed concern, and after reports noted the abruptness and strictness of the changes in the context of the ever-swirling antitrust debate surrounding big tech. Apple continues to walk a tightrope with the App Store, flexing its muscles to enhance data protections for users while simultaneously trying to appear as egalitarian as possible in order to avoid regulatory scrutiny.

Apple is turning Siri audio clip review off by default and bringing it in house

The top line news is that Apple is making changes to the way that Siri audio review, or ‘grading’ works across all of its devices. First, it is making audio review an explicitly opt-in process in an upcoming software update. This will be applicable for every current and future user of Siri.

Second, only Apple employees, not contractors, will review any of this opt-in audio in an effort to bring any process that uses private data closer to the company’s core processes.

Apple has released a blog post outlining some Siri privacy details that may not have been common knowledge, as they were previously described only in security white papers.

Apple apologizes for the issue.

“As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users — but only after making the following changes…”

It then outlines three changes being made to the way Siri grading works.

  • First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
  • Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
  • Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

Apple is not implementing any of these changes, nor is it lifting the suspension on the Siri grading process it halted, until a software update becomes available for its operating systems that allows users to opt in. Once people update to the new versions of its OS, they will have the chance to say yes to the grading process that uses audio recordings to help verify requests that users make of Siri. This effectively means that every user of Siri will be opted out of this process once the update goes live and is installed.

Apple says that it will continue using anonymized, computer-generated transcripts of your requests to feed its machine learning engines with data, in a fashion similar to other voice assistants. These transcripts may be subject to Apple employee review.

Amazon and Google have faced similar revelations that their assistants were being helped along by human review of audio, and they have begun putting opt-ins in place as well.

Apple is making changes to the grading process itself as well, noting that, for example, “the names of the devices and rooms you setup in the Home app will only be accessible by the reviewer if the request being graded involves controlling devices in the home.”

A story in The Guardian in early August outlined how Siri audio samples were sent to contractors Apple had hired to evaluate the quality of responses and transcription that Siri produced for its machine learning engines to work on. The practice is not unprecedented, but it certainly was not made as clear as it should have been in Apple’s privacy policies that humans were involved in the process. There was also the matter that contractors, rather than employees, were being used to evaluate these samples. One contractor described the samples as containing sensitive and private information that, in some cases, could be tied to a user, even with Apple’s anonymizing processes in place.

In response, Apple halted the grading process worldwide while it reviewed the process. This post and updates to its process are the result of that review.

Apple says that around 0.2% of all Siri requests got this audio treatment in the first place, but given that there are 15B requests per month, the quick maths tell us that though it is a tiny share, the raw numbers are quite high: 0.2% of 15 billion works out to roughly 30 million audio clips a month.

The move away from contractors was signaled earlier on Wednesday when Apple let go of contractors in Europe, as noted by Alex Hern.

Apple is also publishing an FAQ on how Siri’s privacy controls fit in with its grading process; you can read that in full here.

The blog post from Apple and the FAQ provide some details to consumers about how Apple handles the grading process, how it minimizes the data given to reviewers and how Siri privacy is preserved.

Apple’s work with Siri from the beginning has focused enormously on on-device processing whenever possible. This has led a lot of experts to say that Apple was trading raw capability for privacy by eschewing the data-center-heavy processes of assistants from companies like Amazon or Google in favor of keeping a ‘personal cloud’ of data on device. Sadly, the lack of transparency on human review processes and the use of contractors undercut all of this foundational work Apple has been doing from the beginning. So it’s good to see Apple crank its privacy policies on grading and improvement back past industry standard. That is where it needs to be.

The fact is that no other assistant product is nearly as privacy focused as Siri — as I said above, some would say to the point of hampering its ability to advance as quickly. Hopefully this episode leads to better transparency on the part of Apple when humans get involved in processes that are presumed to be fully automated.

Most people assume that ‘AI’ or ‘machine learning’ means computers only, but the sad fact is that most of those processes are still intensely human-driven, because AI (which doesn’t really exist) and ML are still pretty crap. Humans will be involved in making them seem smarter for a very long time yet.

Apple rolls out Apple Card Preview to select users

Apple Card is getting its first group of public test users today. A limited number of customers who signed up to be notified about the release of Apple Card are getting the ability to apply for the card in their Wallet app today, as well as the option to order their physical Apple Card.

I’ve been using the card for a few days on my own device, making purchases and payments and playing around with features like Apple Cash rewards and transaction categorization.

A full rollout of Apple Card will come later in August. It requires iOS 12.4 or later to operate.

The application process was simple for me. Portions of the information you need are pre-filled from your existing Apple ID account, making for less manual entry. I had an answer in under a minute and was ready to make my first purchase instantly. I used it both online and in person with contactless terminals.

It…works.

The card on the screen has a clever mechanism that gives you a sort of live heat map of your spending categories. The color of the card will shift and blend according to the kinds of things you buy. Spend a lot at restaurants and the card will take on an orange hue. Shop for entertainment-related items and the card shifts into a mix of orange and pink. It’s a clever take on the chart-based spending category features many other credit cards have built into their websites.
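As a toy illustration of the idea, blending category colors in proportion to spend might look something like the Swift sketch below. The palette, category names and blending rule are all guesses on my part, not Apple’s actual implementation.

```swift
import UIKit

// A toy sketch of the card's "heat map" behavior as described: tint the
// card by blending category colors in proportion to spending. Everything
// here is an illustrative guess, not Apple's actual implementation.
func cardTint(spendByCategory: [String: Double],
              palette: [String: UIColor]) -> UIColor {
    let total = spendByCategory.values.reduce(0, +)
    guard total > 0 else { return .white }

    var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0
    for (category, amount) in spendByCategory {
        guard let color = palette[category] else { continue }
        var cr: CGFloat = 0, cg: CGFloat = 0, cb: CGFloat = 0, ca: CGFloat = 0
        guard color.getRed(&cr, green: &cg, blue: &cb, alpha: &ca) else { continue }
        let weight = CGFloat(amount / total)
        r += cr * weight
        g += cg * weight
        b += cb * weight
    }
    return UIColor(red: r, green: g, blue: b, alpha: 1)
}

// e.g. heavy restaurant spending pushes the tint toward orange:
// cardTint(spendByCategory: ["Food & Drink": 300, "Entertainment": 50],
//          palette: ["Food & Drink": .orange, "Entertainment": .systemPink])
```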


As many have pointed out, if you’re the kind of person that maximizes your points on current cards towards super specific rewards, like travel miles, the rewards system of Apple Card will not feel all that impressive. This is by design. Apple’s aim on this initial offering was to provide the most representational and easy to understand reward metric, rather than to provide top of category points returns.

But it also means that this may not be the card for you if you’re a big travel points maximizer.

I am a points person, and I carry several cards with differing rewards returns and point values depending on what I’m trying to accomplish. Leveraging these cards has allowed me to secure upgrades to higher classes, first class flights for family members and more due to how much I travel. Getting to this point, though, required a crash course in points values, programs and a tight grip on what cards to use when. Shout out to TPG.


You will not be able to leverage Apple’s card in this way as a frequent traveler. Instead, Apple decided on a (by comparison) transparent rewards methodology: cash back based on a percentage of your purchases in 3 categories.

Those categories are 3% on all purchases from Apple Stores, the App Store and Apple subscriptions, 2% daily cash on any Apple Pay purchase and 1% with the physical card either online or offline.

The cash rewards are delivered daily and made available on your Apple Cash card balance very quickly, usually in less than a day. You can then do an instant transfer to your bank for a maximum $10 fee, or a 1-3 day transfer for free. This cashout is faster than just about any other cash back program out there, and the daily tallying is certainly quicker than anyone else’s. And Apple makes no effort to funnel you into a pure statement-credit version of cash back, like many other cards do. The cash becomes cash pretty much instantly.
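To make the math concrete, here’s a small sketch of the published tiers. The percentages come from Apple’s announcement; the type and function names are just mine.

```swift
// The published Daily Cash tiers, encoded as a toy calculator. The tier
// values come from Apple's announcement; everything else is illustrative.
enum PurchaseKind {
    case apple        // Apple Stores, App Store, Apple subscriptions: 3%
    case applePay     // any Apple Pay purchase: 2%
    case physicalCard // the titanium card, online or offline: 1%
}

func dailyCash(for amount: Double, kind: PurchaseKind) -> Double {
    let rate: Double
    switch kind {
    case .apple:        rate = 0.03
    case .applePay:     rate = 0.02
    case .physicalCard: rate = 0.01
    }
    return (amount * rate * 100).rounded() / 100 // round to the cent
}

// e.g. a $120 App Store purchase earns $3.60 in Daily Cash:
// dailyCash(for: 120, kind: .apple) == 3.60
```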

I could easily see the bar Apple sets here — daily rewards tallies and instant cashouts — becoming industry standard.

The card interface itself is multiples better to use than most card apps, with the new Amex apps probably coming the closest. But even those aren’t system level and free of additional usernames and passwords. Apple Card has a distinct advantage there, one that Apple I’m sure hopes to use to the fullest. This is highlighted by the fact that the Apple Card application option is now present on screen any time you add a new credit or debit card to Apple Pay. Top of mind.

The spending categories and clear transaction names (with logos in many cases) are a very welcome addition to a credit card interface. The vast majority of the time, even the best credit card dashboards present you with a crappy list of junk that includes a transaction identifier and a mangled vendor ID that may or may not map directly to the name of the actual merchant you purchased from. It makes deciphering what a specific transaction was for way harder than it should be. Apple Card parses these by vendor name, then website name, then whatever it can parse on its own before it defaults back to the raw identifier. Way easier.


A note: during the setup process, the card will ask if you want it to be your default payment method for everything Apple, automatically attaching to things like App Store and subscription payments. So keep an eye out for that and make a call. You will get 3% cash back on any apps you buy, of course, even if they’re third party.

The payments interface is also unique in that Apple is pushing very hard to help you not pay interest. It makes recommendations on how to pay chunks of your balance over time before you incur interest. It places 1-3 markers on the circle-shaped interface that show you how much you need to pay at minimum, at minimum with no interest, and in full. These markers are personalized a bit and can vary depending on balance, due date and payment history.
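Apple hasn’t published how it derives those markers, but conceptually the three anchor points might look like this hypothetical sketch; the names and rules are my assumptions, not Apple’s algorithm.

```swift
// A hypothetical sketch of the three payment markers. Apple hasn't
// published its logic; the names and rules here are assumptions for
// illustration only.
struct PaymentMarkers {
    let minimum: Double      // smallest payment that avoids a late fee
    let noInterest: Double   // pays the statement balance, so no interest accrues
    let full: Double         // clears the entire current balance
}

func paymentMarkers(minimumDue: Double,
                    statementBalance: Double,
                    currentBalance: Double) -> PaymentMarkers {
    PaymentMarkers(minimum: minimumDue,
                   noInterest: statementBalance,
                   full: currentBalance)
}
```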

I really dug hard into how Apple Card data was being handled the last time I wrote about the service, so you should read that for more info. Goldman Sachs is the partner for the card, but it absolutely cannot use the data it gathers on transactions via Apple Card for marketing, offer chunks of anonymized data to partners about spending habits, or put it to any of the typical marketing uses credit card processors get up to. Mastercard and Goldman Sachs can only use the data for operational purposes: credit reporting, remittance, etc.

And Apple itself neither collects nor views anything about where you shopped, what you bought or how much you paid.

And, as advertised, there are no fees with Apple Card.

One thing I do hope Apple Card adds is the ability to see and filter out recurring payments and subscriptions. This fits with the fiscal responsibility theme it’s shooting for with the payments interface, and it’s sorely lacking in most first-party apps.

One nice design touch, beyond the transaction maps, the color grading that mirrors purchases and the far more readable interface, is a pleasant metallic sheen that activates when you tilt the device.

My physical card isn’t here yet so I can’t really evaluate that part of it. But it is relatively unique in that it is nearly featureless, with no printed number, expiration, signature or security codes on its surface.

The titanium Apple Card comes in a package with an NFC chip that allows you to simply tap your phone to the envelope to begin activating your card. No phone numbers to call and, heaven forbid, no 1-800 stickers on the surface of the card.

I can say that this is probably the first experience most people will ever have with a virtual credit card number. The physical card has a ‘hard coded’ number that cannot be changed. You never need to know it because it’s only used in in-person transactions. If it ever gets compromised, you can request a new card and freeze the old one in the app.

For online purchases that do not support Apple Pay, you have a virtual card number in the Wallet app. You enter that number just as you would any other card number and it’s automatically added to your Safari auto-fill settings when you sign up for Apple Card.

The advantage to this, of course, is that if it’s ever compromised, you can hit a button to request an entirely new number right from within the app. Notably, this is not a ‘per transaction’ number — it’s a semi-permanent virtual number. You keep it around until you have an issue. But when you do have a problem, you’ve got a new number instantly, which is far superior to having to wait for a new physical card just to continue making online purchases.

Some banks like Bank of America and Citibank already offer virtual options for online purchases, and third-party services like Privacy.com also exist. But this is the beginning of the mainstreaming of VCCs. And it’s a good thing.

Apple suspends Siri response grading in response to privacy concerns

In response to concerns raised by a Guardian story last week over how recordings of Siri queries are used for quality control, Apple is suspending the program worldwide. Apple says it will review the process that it uses, called grading, to determine whether Siri is hearing queries correctly or being invoked by mistake.

In addition, it will be issuing a software update in the future that will let Siri users choose whether they participate in the grading process or not. 

The Guardian story from Alex Hern quoted extensively from a contractor at a firm hired by Apple to perform part of a Siri quality control process it calls grading. This takes snippets of audio, which are not connected to names or IDs of individuals, and has contractors listen to them to judge whether Siri is accurately hearing them — and whether Siri may have been invoked by mistake.

“We are committed to delivering a great Siri experience while protecting user privacy,” Apple said in a statement to TechCrunch. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

The contractor claimed that the audio snippets could contain personal information, audio of people having sex and other details like finances that could be identifiable, regardless of the process Apple uses to anonymize the records. 

They also questioned how clear it was to users that their raw audio snippets may be sent to contractors to evaluate in order to help make Siri work better. When this story broke, I dipped into Apple’s terms of service myself and, though there are mentions of quality control for Siri and data being shared, I found that they fell short of explicitly and plainly making it clear that live recordings, even short ones, are used in the process and may be transmitted and listened to.

The figures Apple has cited put the number of queries that may be selected for grading at under 1 percent of daily requests.

The process of taking a snippet of audio a few seconds long and sending it to either internal personnel or contractors to evaluate is, essentially, industry standard. Audio recordings of requests made to Amazon and Google assistants are also reviewed by humans. 

An explicit way for users to agree to the audio being used this way is table stakes in this kind of business. I’m glad Apple says it will be adding one. 

It also aligns better with the way that Apple handles other data like app performance data that can be used by developers to identify and fix bugs in their software. Currently, when you set up your iPhone, you must give Apple permission to transmit that data. 

Apple has embarked on a long campaign of positioning itself as the most privacy conscious of the major mobile firms and therefore holds a heavier burden when it comes to standards. Doing as much as the other major companies do when it comes to things like using user data for quality control and service improvements cannot be enough if it wants to maintain the stance and the market edge that it brings along with it.

AI photo editor FaceApp goes viral again on iOS, raises questions about photo library access

FaceApp. So. The app has gone viral again after first doing so two years ago or so. The effect has gotten better, but these apps, like many other one-off viral apps, tend to come and go in waves driven by influencer networks or paid promotion. We first covered this particular AI photo editor, from a team of Russian developers, about two years ago.

It has gone viral again now due to some features that allow you to edit a person’s face to make it appear older or younger. You may remember at one point it had an issue because it enabled what amounted to digital blackface by changing a person from one ethnicity to another.

In this current wave of virality, some new rumors are floating around about FaceApp. The first is that it uploads your camera roll in the background. We found no evidence of this, and neither did security researcher and Guardian app CEO Will Strafach or researcher Baptiste Robert.

The second is that it somehow allows you to pick photos without giving photo access to the app. You can see a video of this behavior here:

While the app does indeed let you pick a single photo without giving it access to your photo library, this is actually 100% allowed by an Apple API introduced in iOS 11. It allows a developer to let a user pick one single photo from a system dialog to let the app work on. You can view documentation here and here.
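For the curious, here’s roughly what that looks like from the developer’s side: a minimal sketch using UIImagePickerController, which since iOS 11 runs out of process and so doesn’t need photo library permission for a one-off pick. The surrounding view controller is simplified for brevity.

```swift
import UIKit

// A minimal sketch: since iOS 11, UIImagePickerController runs out of
// process, so presenting it does not require photo library permission.
// The app only ever receives the single image the user explicitly taps.
class EditorViewController: UIViewController,
                            UIImagePickerControllerDelegate,
                            UINavigationControllerDelegate {

    func pickSinglePhoto() {
        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary
        picker.delegate = self
        present(picker, animated: true)
    }

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        // Only the tapped photo is handed to the app; the rest of the
        // library remains invisible to it.
        let image = info[.originalImage] as? UIImage
        picker.dismiss(animated: true)
        // ... hand `image` to the editing pipeline
        _ = image
    }
}
```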


Because the user has to tap on one photo, this provides something Apple holds dear: user intent. You have explicitly tapped it, so it’s ok to send that one photo. This behavior is actually a net good in my opinion. It allows you to give an app one photo instead of your entire library. It can’t see any of your photos until you tap one. This is far better than committing your entire library to a jokey meme app.

Unfortunately, there is still some cognitive dissonance here, because Apple allows an app to call this API even if a user has set the Photo Access setting to Never in Settings. In my opinion, if you have it set to Never, you should have to change that before any photo can enter the app from your library, no matter what inconvenience that causes. Never is not a default, it is an explicit choice, and that permanent user intent overrules the one-off user intent of the new photo picker.

I believe that Apple should find a way to rectify this in the future, either by making the behavior more clear or by disallowing the picker when people have explicitly opted out of sharing photos in an app.


Something like the ‘only once’ location option coming in iOS 13 might be appropriate here.

One thing that FaceApp does do, however, is upload your photo to the cloud for processing. It does not do on-device processing like Apple’s first-party apps do, and like Apple enables for third parties through its ML libraries and routines. This is not made clear to the user.

I have asked FaceApp why they don’t alert the user that the photo is processed in the cloud. I’ve also asked them whether they retain the photos.

Given how many screenshots people take of sensitive information like banking details and whatnot, photo access is a bigger security risk than ever these days. With a scraper and optical character recognition tech, you could automatically turn up a huge amount of info way beyond ‘photos of people’.

So, overall, I think it is important that we think carefully about the safeguards put in place to protect photo archives and the motives and methods of the apps we give access to.

The Great Hack was one of the wildest movies I saw at Sundance

The trailer is out for Netflix doc The Great Hack, an early cut of which was screened at Sundance this year. I saw that cut during the fest and it was one of the wildest of a second wave of films trying to make sense of what the hell happened with Facebook and the election. A year ago, the tone was different. It was more shock and awe and impressionist art pieces. The Great Hack is part of a new breed that is making a serious attempt to put things into a narrative that normals can understand.

The film anchors itself mostly on two figures, Parsons School of Design professor David Carroll and ex-Cambridge Analytica employee and ostensible whistleblower Brittany Kaiser, with a cast of other touchstone figures like Guardian journalist Carole Cadwalladr.

One of the major weaknesses of this kind of story is that it is likely best told in minutes of product meetings and repo commits, rather than attached to human narrative. But that’s not how most humans think and the past ten years have proven that even the people charged with protecting users from these systems have very little idea about how they actually work or how vulnerable they were and continue to be to manipulation. So The Great Hack takes an earnest stab at laying out the basics of how Facebook and other online platforms were manipulated and compromised in order to fuel Cambridge Analytica’s manipulation machine and, by extension, election campaigns and other public sentiment scenarios.

The version I saw did its best to connect these topics with tissue that (mostly, but not always) feels like it links the events with the human counterparts involved. It does paint some of the journalists and figures in the piece with a bit of a golden brush, and never goes much further than ambivalence when featuring Kaiser, who was, by her own admission, right alongside Cambridge Analytica CEO Alexander Nix, the villain of the piece (IRL as well as in the doc), through CA’s most controversial period.

But if you’ve been following the whole saga and reading news obsessively, not much in here is going to feel like brand new information. It is likely, though, that there will be plenty that is new to a broader Netflix audience. If they were able to fix some of the pacing issues and land some of the ‘revelations’ with more punch in the final version, I think it may have legs.

The doc hits Netflix on July 24th. You should check it out for yourself.