Three weeks after Apple unveiled its new MacBook Pro laptops, orders for virtually all models still extend out for weeks — as they have pretty much from the start. If you didn’t order right away on Oct. 18 or manage to snag one in an Apple store after Oct. 26, you’ll be waiting a while.
Supply chain issues are part of that problem, but the long-awaited transition from Intel’s chips to Apple’s custom ARM-based system-on-a-chip also highlighted the pent-up demand for true pro-level hardware. And make no mistake, the M1 Pro and M1 Max chips are pretty much everything Apple users would have wanted, sporting high-performance and high-efficiency cores, up to 64GB of RAM, a variety of GPU core options, and benchmarks that basically shout Apple’s underlying message to the industry as a whole: keep up.
There’s a reason the Apple event was called “Unleashed.”
The performance/power problem
For years, there were rumblings that Apple execs weren’t happy with the performance-per-watt ratio of Intel’s processors. Intel chips that ran cool enough to use in a MacBook Air, for instance, were never high performers, while the company’s more powerful chips consumed too much power and generated the heat to match. (That’s a bad combo for laptops.) Each chip from Intel forced a compromise Apple engineers had to design around, leading to Apple’s own foray into chip design.
The M1 Pro and M1 Max improve on the already impressive M1 chip that debuted last year in the 13-in. MacBook Pro and MacBook Air. (Apple still sells the smaller MacBook Pro, but it now lives in the shadow of its big brothers.) The new 14-in. and 16-in. MacBook Pros are meant specifically for business and power users and offer more than just multi-core SOCs: there’s also the 16-core Neural Engine, ProRes hardware accelerators (for high-end video editing), 16GB, 32GB or 64GB of unified memory, up to 8TB of high-speed storage, mini-LED ProMotion displays, and a battery that can last the entire workday.
That’s just the hardware side: Because Apple wrote macOS Monterey specifically to take advantage of the custom hardware, the MacBook Pros deliver performance at unprecedented efficiency. And while there are still computers that are technically faster, there aren’t any that can match Apple’s performance-per-watt. These new Macs seem to defy logic and the age-old expectation that more power always means more heat and less battery life.
Intel couldn’t keep up, and it’s been left behind. Now it’s up to macOS developers to make sure they aren’t left behind, too.
Developers have work to do, too
Since Apple Silicon is an entirely different hardware architecture, existing applications need to be recompiled, at best, or rewritten, at worst, to take full advantage of what Apple has delivered. Apple offers Rosetta 2, a translation layer that enables most x86 Mac applications to run seamlessly on Apple Silicon. Most end users won’t (or at least, shouldn’t) notice; their apps should just work as-is (and some will actually run faster on Apple Silicon, even through Rosetta 2). As developers bring their software fully in line with the M1 chips, their apps should see substantial performance gains.
There are limits to Rosetta 2’s compatibility. Not everything will run; virtual machines and apps built around kernel extensions won’t function properly, or at all. That’s why software developers can’t rely on Rosetta 2 as anything more than a stopgap; it’s not a good idea to leave your users hanging for too long. Big players such as Adobe and Microsoft are already making the transition to Apple Silicon; many others have pledged support, and the stragglers will get there eventually — or they’ll be replaced by alternatives. Given the speed with which Apple is innovating on the hardware side, I wouldn’t wait long if I were a developer.
Apple learns some lessons
The last time Apple released a new notebook, I remember being disappointed as both a longtime Apple user and as a Mac admin. I wanted to want one, but I didn’t. The updates Apple offered in the previous generation of MacBook Pros didn’t fit my needs. I was never a fan of the Touch Bar technology, thought the butterfly keyboard merely serviceable, didn’t like losing the MagSafe connector, and really didn’t like that Apple eliminated all of the port options that put Pro in MacBook Pro. While that MacBook Pro line sold well, the complaints persisted. That’s why all of the changes in the new models are so welcome.
It’s why, unlike the last model, I very much want one of these. And I keep wondering: if Apple can get this much performance out of the MacBook Pro, what’s the desktop Mac Pro going to be like?
Get busy, admins
Mac admins also have to keep up: these new laptops mean MDM solutions and business-critical apps need to be tested to make sure they play well with macOS Monterey. And the arrival of the M1 Pro and M1 Max models means admins have another set of hardware to test for compatibility. While any Mac admin worth their salt should have been testing for Monterey compatibility since WWDC, ensuring Apple Silicon hardware is compatible with existing deployments can no longer be put off.
Reminder: it’s no longer possible to purchase a MacBook Pro with an Intel chip.
And while Intel Macs will be supported for many years to come, Apple Silicon is here, and it’s the future — ready or not.
No change is without pain, and while Apple’s transition to a new chip architecture will cause issues in some production environments, these are good problems to have. The hard part for chipmakers, hardware rivals, developers and Mac admins will be keeping up with Apple now that it is, indeed, unleashed.
Apple arguably jumped into the rapidly evolving Apple device management space when it introduced Apple Business Essentials this week. But how do people in the industry feel about the company’s debut?
Jamf CEO welcomes the opportunity
“When Apple innovates, Jamf celebrates,” Jamf CEO Dean Hager said on learning about Apple Business Essentials. “We believe this expected announcement is good news and presents Jamf with a terrific opportunity.”
Milanesi thinks Apple’s entrance into the market may be a problem for Apple MDM vendors such as Jamf, but sees opportunities for them to enhance Apple’s basic offer in other ways. That’s also what Hager thinks.
Jamf, which announced an impressive set of Q3 results Nov. 11, has always existed alongside Apple. Hager noted several times during the last decade when industry watchers thought Apple moves might damage his business: Once when Apple introduced MDM in 2010, again in 2011 with Profile Manager, later with Apple Configurator, and more recently with Apple Business Manager.
Bridge the gap
Hager argues that in each of those cases Jamf’s business grew as it worked to bridge the gap between what Apple offers and the sometimes more specialized needs of enterprise customers.
Speaking during the company’s earnings call, Hager shared some information concerning larger clients, some of whom moved to Macs on the strength of Apple’s M1 chips. These examples also included large-scale deployments, such as 100,000 devices in use in the airline industry and the iPads used during the recent SpaceX space flight.
Arguably, deployments like these represent the more specialized requirements that emerge once businesses grow beyond a certain size and need more complex solutions than Apple, at least at present, provides.
Hager also thinks his company’s growing portfolio of security and education-focused products gives it extra ammunition to help businesses using Apple products. Jamf has also built market-tested solutions for zero-touch deployment, support for Microsoft Azure, and more.
It wasn’t a surprise
Apple’s move wasn’t a huge surprise. The company had been expected to introduce something of this nature since it acquired smaller MDM provider Fleetsmith in 2020.
Apple had to improve its business management offer, argued Hager. Business users needed an entry-level tool, and Apple needed a more equal footing with other solutions aimed at businesses of that size.
The company’s existing Apple Business Manager can be seen as a little too complex for small businesses, he said. Apple Business Essentials will make it easier, which should help further accelerate SMB adoption of Macs, iPhones, iPads, and Apple TV.
Apple’s move also puts it on a more equal footing with Surface and Chromebook devices when it comes to remote wipe of business data. Hager cited (but did not share) first-hand Jamf data showing that some small businesses resist moving to Apple systems because of challenges of that kind.
“These problems needed to be solved,” he said. “This is going to raise Apple’s profile in business. The weakest spot for Apple in business has always been for the small businesses who just want to get started.”
Addigy also sees opportunity
Addigy CEO Jason Dettbarn also seems positive about Apple’s move. “This announcement demonstrates Apple’s commitment to Apple at work and heavy investment in the robustness of MDM protocol for Apple MDM vendors like Addigy,” he said. Business Essentials “provides a great jumping-off point for customers to adopt Apple” and then move to more sophisticated systems as they need them.
Apple is arguably striking the market at a pivotal moment.
Leading from below
The move to hybrid working has pushed employee choice even higher up the corporate agenda, and most new employees now prefer Macs. That preference has driven big investment in Macs for business. IDC claims that in Q3 Apple shipped more Macs than in any quarter in its history, at a growth rate double that of the industry, a sustained pattern that has made the Mac the fastest-growing computer over the past seven quarters.
Moving forward, nothing has changed, said Hager.
“We are going to fill the gap between what Apple builds and the enterprise requires,” he said. “We see Apple Business Essentials customers as a new market of new small business coming up, and we will ensure our additional products sell well into that base and add extra value. We’ll give them a path forward into more robust and scalable solutions.”
If Apple’s moves help generate continued growth for its platforms in enterprise markets, business users will enjoy a wealth of integration providers capable of supporting that migration: Apple serving the smallest operations, and bigger partners such as Jamf, Addigy, and SAP helping to support platforms that had almost zero enterprise market share 10 years ago.
Things have changed since 2010.
“I don’t believe you can be a credible provider of enterprise software if you’re not part of the Apple ecosystem today,” says Jeetu Patel, general manager and executive vice president, Cisco security and collaboration.
Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
All right, Pixel pals: We’ve talked about plenty of buried treasures you can dig up on your oh-so-Googley phone, thanks mostly to Android 12’s arrival. But there’s one fancy new feature you can make even more useful with a quick bit of crafty customization.
I’m talkin’ about the Quick Tap system introduced on the Pixel 6 and Pixel 6 Pro last month and also now available on the Pixel 5 and Pixel 4a 5G.
Quick Tap, in case you haven’t yet discovered it, is a splendid new shortcut system connected to physical presses of your favorite phalange. Once you set it up, you can simply tap twice on the back of your Pixel’s body to trigger a specific action on the phone.
Nifty, no? I sure think so. It’s a smart time-saver and a fantastic way to create your own fast-access shortcut to whatever function you want. The problem is just that the list of available options is annoyingly limited as of now, and what’d be the most practical and logical shortcut for that setup — especially for those of us with the new Pixel 6 models — is missing in action.
As with most things on Android, though, where there’s a will, there’s a way. And I’ve come up with a super-simple method for enhancing the Pixel’s Quick Tap feature and making it do the one obvious, impossibly helpful thing it won’t do now.
[Psst: There’s lots more Pixel magic where this came from. Check out my free Pixel Academy e-course to uncover tons of advanced intelligence lurking within your current Pixel phone!]
Read on, and I’ll explain.
The Pixel Quick Tap limitation
We’ll get right to it: The missing function I’m fussing about is the plain and simple ability to silence your phone within the Pixel’s new tappity-tapping shortcut system. It’d be a welcome option for Quick Tap on any Pixel, really, but there’s a reason it’s particularly pertinent for the Pixel 6 and Pixel 6 Pro.
The newest Pixel devices, y’see, no longer have the handy mute shortcut every Pixel before ’em has enjoyed. On Pixels past, a quick press of the power and volume-up buttons would silence your phone in a flash. It’s always been the easiest and most convenient way to shoosh your phone at a second’s notice, without having to futz around with any menus or on-screen options.
On the Pixel 6 and Pixel 6 Pro, though, Google eliminated that option. The reason, it seems, is that it conflicts with the awkward new setup in which the devices’ power button now pulls up Google Assistant by default. Because of that, the power and volume-up combo has become the way to summon the standard power menu that actually lets you shut down or restart your device.
It’s a bit of a mess, frankly, and more than a little silly. I mean, c’mon: It’s called the power button — not the Assistant button! But it’s also yet another indication of how hard Google’s working to put Assistant anywhere and everywhere and make it as frictionless as possible for you to use.
(On the Pixel 5 and Pixel 4a 5G, the same setup is also available — but unlike on the newer Pixel 6 models, if you opt not to have your power button pull up Assistant, the old power-volume-up muting shortcut still works. On the Pixel 6 and 6 Pro, it just isn’t present at all, no matter what you do.)
Now, for the fix.
The Pixel Quick Tap expansion
By default, the new Pixel Quick Tap system can handle six different functions: taking a screenshot, summoning Assistant, playing or pausing media, showing your notifications, showing your recent apps, and opening an app of your choosing.
Why silencing your phone isn’t part of that list is beyond me. But with about 20 seconds of tweaking, good golly, we can change that.
The trick ties into that final Pixel Quick Tap option — the one to open an app on your phone. And it revolves around a basic-as-can-be and completely free app from the Play Store called, rather appropriately, Mute.
After much searching and experimenting, I stumbled onto this random little gem and realized it was the perfect answer to our pressing problem. It doesn’t look like much, and it hasn’t even been updated in almost five years. But don’t let that throw you. For our current purposes, it is exactly what we need.
All you’ve gotta do is install the app and then open it once. You’ll be prompted to allow Mute to modify your system settings — a permission it obviously needs if it’s gonna be able to silence your phone for you. The app doesn’t require any other permissions or manner of access, and it can’t even access the internet. All it does, quite literally, is silence your phone whenever it’s activated.
See where we’re going with this? The next step is to open up your Pixel’s Quick Tap settings — by heading into the System section of your Pixel’s settings, then tapping “Gestures” followed by “Quick Tap” (again, available only on the Pixel 6, Pixel 6 Pro, Pixel 5, and Pixel 4a 5G).
Make sure the toggle at the top of the screen is on and active. Next, select the “Open app” option at the very bottom of the list, then tap the gear-shaped settings icon alongside it and select “Mute” from the list of choices that comes up.
Head back to your home screen, and that’s it: Now, anytime you double-tap the back of your Pixel, you’ll feel a brief vibration and see a small visual confirmation that Mute has been activated. Your phone’s media and ringer volume will shoot all the way down to zero, and Android’s Do Not Disturb mode will be activated. Double-tap again, and Do Not Disturb will go back off while your volume settings pop back up to their previous levels.
Doesn’t get much easier than that.
Don’t let yourself miss an ounce of Pixel magic. Sign up for my free Pixel Academy e-course and discover tons of hidden features and time-saving tricks for your favorite Pixel phone.
By rolling out some Android 12 features exclusively to Pixel users, Google can separate its own devices from the rest of the Android pack. Plus, Android 12 and the new Pixel 6 both purport to have privacy-centric changes. Computerworld Managing Editor Val Potter and Contributing Editor JR Raphael join Juliet to discuss Android 12 and how it performs on the new Pixel 6 and Pixel 6 Pro.
Apple’s new MacBook Pro laptops were unveiled just three weeks ago, and have been in users’ hands for only two. Having spent a week using one of the 16-in. versions, I can say it represents a huge leap for Apple’s computer platform by tying together all of the elements of the company’s computing vision.
How the MacBook Pro performs: the TL;DR
Performance data confirms Apple’s launch claims; it’s fast.
Battery life and performance mean you can achieve much more with these Macs.
You effectively end up with a reference monitor in your backpack.
You’ll hardly ever hear the fan; these Macs run cool.
Desktop performance on the go that’s as effective in the office as in the field.
An overall triumph of design and execution, from the processor to the OS.
Apple’s move to M-series processors raises the status of its entire Mac fleet.
I’ve been working with the mid-range 16-in. model equipped with an M1 Pro chip that has a 10-core CPU, 16 GPU cores, and 16GB of unified memory. It costs $2,699 (and is available to the same specifications in a 14-in. model for $2,499).
Cast your mind back to the late 2019 MacBook Pro with an Intel Core i9-9980HK chip; it yielded Geekbench scores of 1,087 (single-core) and 6,823 (multi-core). Then recall the M1-based 13-in. MacBook Pro from last year; it achieved 1,706 (single-core) and 7,385 (multi-core) scores. (The M1-based MacBook Air hit similar numbers.)
I couldn’t quite believe the data I generated with Geekbench testing on this MacBook Pro: On average across five tests, the new Mac hit 1,755 (single-core) and 12,547 (multi-core). That’s as good as a late 2017 iMac Pro or an entry-level late 2019 Mac Pro — in a system you can carry under one arm.
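To put those scores in perspective, here is a quick back-of-the-envelope comparison using only the numbers quoted above. This is a rough sketch; Geekbench scores are just a proxy for real-world performance:

```python
# Geekbench 5 scores quoted in this article: (single-core, multi-core)
scores = {
    "2019 MacBook Pro (Core i9-9980HK)": (1087, 6823),
    "2020 13-in. MacBook Pro (M1)": (1706, 7385),
    "2021 16-in. MacBook Pro (M1 Pro)": (1755, 12547),
}

intel_single, intel_multi = scores["2019 MacBook Pro (Core i9-9980HK)"]
m1pro_single, m1pro_multi = scores["2021 16-in. MacBook Pro (M1 Pro)"]

# Speedup of the new M1 Pro machine over the last Intel MacBook Pro
single_speedup = m1pro_single / intel_single  # roughly 1.6x
multi_speedup = m1pro_multi / intel_multi     # roughly 1.8x

print(f"Single-core: {single_speedup:.2f}x, multi-core: {multi_speedup:.2f}x")
```

In other words, the big win is in multi-core throughput, which is exactly where the extra high-performance cores come into play.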
This performance boost reflects how Apple configured the cores on these chips. As part of its processor evolution, Apple turned an additional two cores on these systems into high-performance cores, and that move is reflected in these results.
You’ll get work done faster
What that performance means is significant. Put simply, if your work involves pushing computers to peak performance for development, design, video, or scientific research calculations, these new Macs will help you get work done faster. For an individual, that matters; if your company runs fleets of machines, you may well see real cost savings and productivity increases over time if you deploy them.
At least one developer said his company will see these Macs pay for themselves in terms of productivity benefits within three months of deployment. Many developers are deploying these Macs because they are significantly more capable than existing laptops. I’ve also seen reports that claim significant application speed increases on these Macs. (Developers can also now use Apple’s TestFlight to distribute application betas to Macs.)
Apple and others have published numerous statistics to explain the performance advantages.
If you work in post-production or video, you’ll benefit from 1.7x the rendering speed when working in 8K video.
Software developers migrating to these Macs are seeing Xcode compile and project building speeds double.
3D artists claim 2.5x faster rendering using Redshift.
There are other statistics available, but what these three have in common is that each represents professional tasks in which the speed of the Mac makes a real difference in how quickly a project moves forward. These machines reduce the inherent productivity constraints put in place by the time it takes your computer to accomplish tasks.
Shhhh. Quietly goes the work
I still haven’t heard the MacBook Pro’s fan. To push it, I’ve created 500 loops (basically the same loop copied over and over) of audio in Logic, tried out some video editing and transitions in Final Cut Pro, and pretended to compile sample code in Xcode. I’ve used email, opened multiple Safari windows, and left all the creative apps running in the background. I even considered running Chrome, but since that browser probably hasn’t yet been optimized to play nice on these Macs, I held off. I tried all these things together along with downloading a movie, watching “Foundation” and making a FaceTime call to my partner. (She was in the room next door and told me to stop being weird.)
No fan noise.
I am certain it is possible to make these fans start up. I’ve certainly read about it. But I couldn’t make it happen myself. I even opened System Preferences > Battery and enabled High Power Mode.
I’m guessing if the late CEO Steve Jobs is watching from the afterlife, he’ll feel pretty good, given he always aspired to fanless Macs; in this configuration, Apple is providing close to that even while doing computationally demanding tasks.
This is also reflected in energy consumption. I was pleasantly surprised by how well the 99.6-watt-hour lithium-ion battery held up. What I’ve learned suggests that if you are a video professional and you take one of these Macs outdoors to work on a project, your computer will probably hold up for a day. Apple claims 21 hours of battery life. Again, that will change depending on what you are doing. Photographer Austin Mann describes the battery life as putting these Macs on “a radically different planet.” He believes it’s so good that pro users like himself will begin to use their Macs to do tasks in the field they would have avoided unless connected to power. Not only will the tasks you already do on battery power get done faster, but you’ll begin to do more tasks because the battery lasts longer.
While previous MacBook Pros delivered better performance when connected to power, there is no performance difference in these models when running on the battery alone. Apple published a series of charts showing you can expect maximum performance with far less battery drain, and says an M1 Pro Mac will match the CPU/GPU performance of any 8-core PC chip while using 70% less power.
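A quick bit of arithmetic shows why that 70% figure is so striking. If performance is equal while power consumption drops by 70%, performance-per-watt improves by a factor of 1/(1 - 0.7), or roughly 3.3x. This is a simplified sketch based on Apple’s published claim, not an independent measurement:

```python
# Apple's claim: same performance as an 8-core PC chip at 70% less power.
performance_ratio = 1.0    # equal performance (assumption per Apple's chart)
power_ratio = 1.0 - 0.70   # 30% of the PC chip's power draw

perf_per_watt_gain = performance_ratio / power_ratio  # roughly 3.3x
print(f"Performance-per-watt advantage: {perf_per_watt_gain:.1f}x")
```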
Think about that.
Anyone running a fleet of PCs could see significant reductions in energy expenditure after a move to Apple Silicon; larger companies may also see measurable reductions in carbon emissions. We already know the M1 Mac mini is significantly cheaper to run than preceding Macs, and I see no reason to think that won’t be the case here, though energy consumption obviously reflects use. I have spent time attempting to clarify just how much energy these Macs consume in order to estimate that impact, but haven’t found the data yet; Apple will likely publish energy consumption figures soon.
A reference display in my backpack
Have you ever watched a movie at a 90-degree angle to the screen on your Mac? Now you can. The 254ppi XDR display on these Macs is phenomenal. Essentially, it’s a reference monitor you can carry with you. This is a great experience for any user, but if you use your Mac for color correction, film, design, photography, or other creative tasks, it’s going to vastly improve your working life with its color accuracy. Those 7.7 million pixels add spark to anything you see on screen.
You might not notice the 120Hz refresh rate, not because ProMotion doesn’t add anything to the experience but because you won’t be aware it is there. What I saw is that items on the display seem smooth when you scroll through them, images are finely detailed, and I enjoyed deeper blacks, brighter whites and colors that bounced off the screen. The Liquid Retina XDR display is rated at 1,600 nits, which is bright, and offers a million-to-one contrast ratio. This is a nice-to-have for most people, but is a thing of brilliance and wonder to creative professionals working with fully optimized content — particularly video, architecture, or medical imaging professionals. (Your spreadsheets will also look better.)
The display also has built-in reference modes that make grading HDR content seamless.
Notch a problem?
I’m completely unconcerned about the notch at the top of the screen on these machines. While I appreciate that a very small number of the professional users who need one of these Macs may use multiple Menubar icons, I don’t believe many will run into a real problem. I also appreciate that some applications may not yet be optimized to handle the notch, but the ones pros use probably soon will be, and Apple has a setting to help manage that. In my opinion, the fact that the notch has become the primary criticism of these new Macs shows us what a supremely limited palette of criticism exists.
Of course, the notch is only one aspect of these Macs. We also get the welcome return of a full-sized, proper keyboard that feels much more substantial than the butterfly keyboard it replaces. While I liked the Touch Bar, I missed keys that work reliably far more. This swap — and the mood-matching backlit keys — makes the MacBook Pro a more dependable system. That we at last have a 1080p FaceTime camera and better Wi-Fi connectivity just sweetens the deal.
(Apple has also created some machine vision intelligence algorithms that mean you get a better image in low light conditions with less graininess and more detail in the shot.)
Some may recall the praise heaped upon the built-in mic and speaker system in the 2019 MacBook Pro. All of this remains in the new edition and has been improved with more bass output across a wider range of frequencies, with clearer sounding audio in the mid- and high range. That means this Mac delivers real “oomph” for audio playback, supports spatial audio with Dolby Atmos, and has a three-mic array that’s sufficiently sensitive to pick up even quiet voices. If you make podcasts, this portable production studio will see you through.
One note: at a substantial 2.1kg, it is slightly heavier, and a little thicker, than the model it replaces.
Connecting it up
MagSafe is another welcome improvement, if you consider not tearing your Mac apart when someone inevitably trips over the power cable a good thing. For me, MagSafe remains one of Apple’s most useful notebook innovations. I can still recall the horror I felt when I accidentally ripped the power input from my clamshell orange iBook, which then tumbled tragically to the stone kitchen floor.
You can still charge up the laptop using one of the standard USB-C ports. Given that you can charge the battery up to 50% in 30 minutes using the included MagSafe charger, though, why would you want to use USB-C?
The I/O on offer is improved. The SD card slot is back, along with three Thunderbolt 4 (USB-C) ports, an HDMI port, and a headphone jack — along with the ability to run two external displays at up to 6K. I imagine pro video editors will maintain a hub connected to a host of peripherals at their desk and simply plug their MacBook in when not in the field.
For me, at least, it’s as simple as one cable connected to my powered Elgato hub. While your experience may vary, it’s hard to ignore that what’s happening here is that these Macs effectively give you the equivalent of an entry-level Mac Pro you can tuck under your arm.
No wheels are required.
In praise of Apple’s whole widget strategy
Apple acquired PA Semi in 2008. When it did so, the acquisition likely reflected a strategic decision the company had already made, possibly even before the introduction of the first iPhone.
That decision was almost certainly inspired by the poor results of the PowerPC processor plan, which stymied Apple’s efforts for years. The move also hints that the use of Intel chips in Macs was — on some level, at least — seen as a temporary stop-gap before Apple could build its own processors.
Throughout the years Apple worked with other people’s chips, the company sought to optimize application and system performance via the operating system, some of the software, and by designing the hardware. While limited by what its chosen third-party hardware components could achieve, Apple worked closely with some vendors to find optimizations and produced a system that still impressed despite those hardware compromises.
Apple doesn’t have to make the same compromises any longer. It can now design the operating system, the hardware, the processor, and some of the software, bringing that work together to realize a fully optimized experience on all its products, including Macs.
We’re only really at the beginning of the Mac processor part of this new journey. But as we reach a moment when performance enhancements on any platform depend on on-chip optimizations and the kind of software and hardware design decisions Apple has already been making, Apple is well set for the future.
With TSMC, it seems likely that the first mass-market PCs in the industry to run on 3-nanometer chips will have an Apple logo. But as we move toward 1nm chips it’s going to become more cost-efficient to optimize how chips work rather than the process used to make them.
It is in that context I believe Apple’s brand-new MacBook Pro combines all the benefits of the company’s many years of strategic innovation. It’s a remarkable testament to the company’s determined approach, though it doesn’t explain the years during which Apple seemed to show little interest in Macs.
Must or miss?
If you need this kind of power (or the expansive 16-in. display), the MacBook Pro is a definite must. This model combines all that made its predecessor great, addresses all the criticisms people had, and underpins everything with a chip that performs like no other. And yes, next year’s model will be better again, but this model is already better than what was once seen as best.
While I would like for the Apple logo on the lid to light up, I consider this MacBook Pro to be a triumph at every level, from the years of work on the internal processor to the OS and display. Such compromises as do exist (like that notch) really only serve to show how far Apple has knocked this particular ball out of the ballpark.
Fancy new features are fan-frickin’-tastic. But let’s face it: We aren’t all carrying Google’s shiny new Pixel phones. And we don’t all have Android 12 in front of our shiny faces just yet.
With all of that in mind, I thought now would be a fine time to turn our attention to some of Android’s many buried treasures — phenomenal time-saving and productivity-boosting possibilities built right into the software on our existing phones, no matter who made ’em or how old they may be (within the realm of reason, anyway; if you’re still totin’ around a phone with Froyo, sorry pal, but you’re on your own).
Specifically, I want to think our way through some incredibly useful advanced features connected to Google Assistant — the friendly if sometimes slightly sassy virtual companion that’s always standing by and ready to lend a helping hand (and/or voice).
The best part about these Assistant-associated gems is that they’re every bit as beneficial with a three-year-old LG jalopy as they are with a high-end 2021 flagship. But since Assistant commands are inherently invisible, they’re all too easy to overlook or forget.
So without further ado, I give you 11 advanced Google Assistant commands you should really remember to use on Android — no matter what Android phone you’re carrying or which Android version it’s running.
[Want even more advanced Android knowledge? Check out my free Android Shortcut Supercourse to learn tons of time-saving tricks for your phone.]
Google Assistant Android trick No. 1: The voice in your head
This first Google Assistant trick for Android is one of my all-time favorites. It’s one of Assistant’s most practical and impressive powers — and yet, hardly anyone seems to realize it’s available.
Once you know it’s there, though, good golly, will you be giddy. So here it is: Anytime you’re viewing an article within Chrome, the Google app, or the Google News app on your Android phone, you can summon Assistant — by saying “Hey Google” or using whatever method you like — and then say one of the following commands:
Read this page
Or Read this please (if you want to be extra-polite)
However you phrase it, Assistant will merrily oblige and start reading the article on the screen out loud to you. It’s an awesome way to catch up on Very Interesting Content™ whilst driving, walking, horsey chasing, or whatever it is you get up to.
And the experience is actually quite lovely: Just like if you were listening to a traditional podcast, you can skip around in the audio and adjust the playback speed via a bar at the bottom of your screen. You can even control the playback with a panel that’ll pop up in your notifications anytime the audio is playing.
The only requirements are that you’re running Android 5.0 (a 2014 release) or higher (and if you aren’t, we’ve got bigger fish to fry) and that you’ve got Assistant’s language set to English.
Google Assistant Android trick No. 2: Power status
Wondering how your phone’s battery is doing toward the end of a long day? Maybe one filled with much horsey chasing? No problem: Just ask your Android-dwelling Assistant What’s my battery level, and you’ll have the answer in half a hoofbeat.
What makes this command especially useful is the fact that it’ll also work from an Assistant-connected speaker or Smart Display, so even if your phone isn’t right there with you, you can still ask Google how much power it has left.
In that instance, just be sure to add an extra bit of specificity into your command, with What’s my phone’s battery level as your spoken request. Assistant will rattle back the exact remaining percentage, no matter where your phone may physically be.
Google Assistant Android trick No. 3: The instant phone finder
Next, let’s step back a bit from there and get to an even more basic but still oft-relevant request: When you can’t find your phone at all, remember the Google Assistant command Where is my phone as your one-stop phone-finding solution.
This one works on a couple different levels:
If you have multiple Android devices connected to the same Google account — say, a work phone and a personal phone — you can give this command to any one of those gizmos and have it return results for the others.
No matter how many Android devices you’ve got on your account, you can give the command to an Assistant-connected speaker or Smart Display or even a current Chromebook, so long as you’re signed into the same Google account on that device.
Whichever way you go, Assistant will offer up the missing phone’s last known location and also offer to ring it for you.
Google Assistant Android trick No. 4: The swift shusher
Another useful option from the phone control file is Assistant’s ability to silence your device on demand whenever the need arises.
It may seem obvious, but it’s easy to forget: A quick uttering of Silence my phone will cause Assistant to turn your device’s garsh-dang volume all the way down to avoid any interruptions.
And just like our last two commands, this one will work on your actual Android device or on any Assistant-connected speaker, Smart Display, or Chromebook where you’re signed into the same account.
Just remember to say thanks when you’re done.
Google Assistant Android trick No. 5: Location dictation
All right — we’ve covered what to do if you can’t find your phone. But what if you have your phone and yet still don’t know where you are?
First things first, try cutting back on the peyote, pal. And if that doesn’t work, say Where am I to your friendly neighborhood Assistant. That’ll cause it to pop up a map showing you your own current location at that very moment.
On a related note, you can ask Assistant to Share my location to pull up a simple interface for sending your current whereabouts to anyone you need — a co-worker who’s supposed to meet you at a client’s office, a jolly ol’ chap who can’t remember the restaurant you agreed upon for lunch, or even a horsey you were chasing who got away and wants to re-engage you in the timeless art of recreational gallop.
Google Assistant Android trick No. 6: Your screenshot genie
Android allows you to snag screenshots with a variety of mechanisms, both on-screen and physical-button-based, but those commands aren’t always convenient. Sometimes, you need a screenshot saved when you don’t have a hand and/or hoof free for one reason or another.
In those moments, simply shout out to your Android Assistant and then say these magic words: Take screenshot. And that’s it: Assistant will save an image of whatever’s on your screen at that moment — no tapping, button-pressing, or frustrated fussing required.
Google Assistant Android trick No. 7: Snappy photo snapping
Taking a picture typically takes both patience and fingers — at least, in the traditional ways of handling that task. When you need to snap a photo on your phone without all the usual requirements, keep some of these Assistant possibilities at the front of your bustling brain:
Take a picture
Take a selfie
Take a video
And for the real Assistant-flexing power, add in 10 seconds (or whatever amount of time you need) to the end of any of those commands. That’ll cause Assistant to start a countdown timer so you’ll have time to pose your square-shaped head just right and start smiling all purty.
Google Assistant Android trick No. 8: The sanity-saving noise-maker
Whether you’re back in the office and trying to adjust to all the workplace racket or working from home and trying to tune out all of that racket, Assistant has an excellent option to help you focus.
Let’s all say it together now: Play white noise.
And if that ever isn’t cutting it for you, you’ve got a bunch of other variations just waiting to be embraced:
Play relaxing sounds
Play nature sounds
Play water sounds
Play running water sounds
Play babbling brook sounds
Play oscillating fan sounds
Play fireplace sounds
Play forest sounds
Play country night sounds
Play ocean sounds
Play rain sounds
Play river sounds
Play thunderstorm sounds
Play thrashing death metal sounds*
*Not actually available on Assistant yet, but maybe it should be
Google Assistant Android trick No. 9: The settings shortcut
Assistant’s got a lot of options, especially on your phone. But, rather ironically, getting to the Assistant settings on Android isn’t exactly intuitive. You’ve usually gotta go through either the Google app or the Google Home app and wade your way through multiple murky menus to find what you want.
Well, here’s the better way: Holler at Assistant and say Assistant settings. Bam. Problem solved, seconds saved.
Google Assistant Android trick No. 10: Auto-repeat
If Assistant ever says something you didn’t quite catch, try a variation of one of these little-known commands:
Can you repeat that more slowly?
Can you repeat that more loudly?
I didn’t hear you
You could also always just squawk out a loud “HUH?!” in your best old-fella-asking-you-to-speak-up impression — but odds are, Assistant won’t acknowledge it.
Google Assistant Android trick No. 11: The most useful command of all
Well, my dear, there’s bad news: We’ve nearly reached the end of this Assistant Android collection.
But wait! There’s good news, too: We’ve saved the best for last.
Assistant, y’see, is a lovely virtual creature. It can be quite helpful and often even pleasant. But it also has a habit of sometimes butting in and responding to something when you didn’t actually want its involvement.
The next time Assistant starts flappin’ its digital yap out of line, just remember these four magical words:
That wasn’t for you.
Easy, right? Now if only we could find a way for that to work on humans, too.
Get six full days of advanced Android knowledge with my new Android Shortcut Supercourse. You’ll learn tons of time-saving tricks for your phone!
Apple today took a major step into enterprise technology provision, unveiling a new service called Apple Business Essentials — adding yet another strong argument to support enterprise deployment of its products.
What is Apple Business Essentials?
The service bundles an array of tools for small and midsized companies within one Apple-friendly mobile device management (MDM) offering. It is aimed at businesses with up to 500 employees.
Apple Business Essentials is available now in beta and is set to launch for real in spring 2022. It provides tools including iCloud+ for Work, AppleCare, 24/7 Apple support, device and application management and automated setup using Collections and Smart Groups.
Apple said the service will be available in the U.S. initially, with prices ranging from $2.99 a month per user to $12.99 a month per user depending on number of devices and storage levels. (More info below.)
The company also introduced a new Apple Business Essentials app that employees can use to install apps assigned for work and to request support.
Apple Business Essentials promises easy setup, onboarding, backup, security, support, repairs and updates, and 24/7 on-call tech support.
With a nod to remote working, the Apple support component is flexible, which means if you’re an employee working remotely from home, you’ll also be covered. And if a tech support visit is required, an engineer will be sent to your home within four hours. In other words, an SMB can outsource at least some of its tech support provision using the service.
In addition, the new Apple service can be used with existing MDM solutions, though the company thinks its service will be of more value to firms that are new to mobile management. The idea is that Apple Business Essentials makes it as easy to manage Apple devices as it is to use them.
Why is this significant?
Think back 10 years to the introduction of the original iPad, when Apple was a minnow in the enterprise sea. There may have been a few creative professionals using Macs (and executives using iPhones), but the company’s enterprise position was limited.
That changed with the iPad, when C-class executives rushed to pick up the new tablet. As they did, attitudes about Bring Your Own Device policies also changed, and employees became empowered to use at work the tech they relied on at home.
[Also read: SAP VP Martin Lang touts Apple, M1 Macs and the mobile enterprise]
Today, when given a choice, most employees will choose an Apple product, with Macs experiencing a particular renaissance on the back of the superb M-series processors Apple has made.
Discussing the importance of Apple Business Essentials, Carolina Milanesi, president and principal analyst at Creative Strategies, said:
“As a long time Apple ‘watcher,’ you know all too well Apple has been very shy in talking about how they address the enterprise needs. They have always preferred to be seen as providing to the user of the technology rather than providing to the IT manager. Since the launch of the M1, I have been arguing that Apple’s biggest opportunity to grow share in the PC market rests in the enterprise. The product has proven itself to be not only desirable, but capable of addressing the enterprise needs and moving from BYOD or Back door approach to IT deployed will allow Apple to grow share more quickly. I think this is why the announcement is timely and significant.”
The move reflects the extent to which Apple has staked out space in the enterprise, and represents the company’s efforts to expand rapidly in the sector.
Milanesi also said the move may challenge others already in the space, but said, “there is still opportunity in the market if companies want to differentiate by supporting Mac and Windows.”
The analyst also noted that PC-centric companies such as Dell, Lenovo, or HP that have been working to offer support to enterprise customers may be challenged, as they “always struggle to support Mac in the same way.”
The Apple message
In its statement introducing the new service, Apple’s vice president of enterprise and education marketing, Susan Prescott, said:
“Small businesses are at the core of our economy, and we’re proud that Apple products play a role in helping these companies grow.
“Apple Business Essentials is designed to help streamline every step of employee device management within a small business — from setup, onboarding, and upgrading, to accessing fast service and prioritized support, all while keeping data backed up and secure, so companies can focus on running their business.”
What do companies get?
With everything managed in one console, SMBs get:
Device and user management;
iCloud for work storage;
AppleCare+ for Business Essentials (optional).
The console provides an instant view of monthly costs and can manage individual devices – so the Apple TV in a conference room can also be maintained.
What are Collections?
Apple has thought about what happens when you introduce a new employee to your organization. The Collections feature lets IT configure settings and apps for individual users, groups, or devices. You may want to install a specific app on all the hardware used by the accounting team, for example. Collections lets you define roles, assign permissions, and more. It also provides IT with an easy-to-read account of the current app licensing status, so admins can tell when they might need to revoke or acquire new licenses for corporate teams.
What device and user management features are included?
Apple Business Essentials also automates services and permissions, and can configure an employee’s personal device to safely, securely, and — above all, privately — carry work-related data. It does this through support for two Apple IDs on the device: the managed ID and the employee’s own. The managed ID is created during user enrollment, when cryptographic separation for work data is introduced. This helps ensure employee data remains private, while company data remains secure.
In use that means that when employees sign in to their corporate or personally owned device with their work credentials, Collections automatically push settings such as VPN configurations and Wi-Fi passwords. It will also install the Apple Business Essentials app, through which employees can download and install key business apps such as the recently improved Cisco Webex or Microsoft Word.
Apple also said the service empowers IT managers to enforce critical security settings such as FileVault for full-disk encryption on Mac, and Activation Lock to protect devices that may be lost or stolen.
What is iCloud for work?
I evangelized about the need for Apple to introduce iCloud for the enterprise a few weeks ago. The company has now done just that.
Apple Business Essentials provides a dedicated iCloud account for work that works just like any other iCloud deployment. Business data in iCloud is automatically stored and backed up, making it easy to move between devices or upgrade to a new device — and you can choose the amount of data to provide to employees.
Employees continue to retain separate access to their own iCloud data, enabling separation between personal and work-related use.
AppleCare+ for Business Essentials
We don’t yet know what this feature will cost, but Apple is also promising an optional extra in the form of AppleCare+ for Business Essentials. Companies that sign up for this will be able to offer staff 24/7 phone support, get training for both IT admins and employees, and cover up to two device repairs per plan each year.
(The cool thing about the latter is that these can be applied across any user/device covered by an Apple Business Plan, so the cumulative impact is that even accident-prone employees should be supported.)
Employees can initiate repairs from within the app, and Apple-trained technicians will be onsite in as little as four hours to get devices back up and running. If your employee is remote, the technician will visit them where they are.
What’s it like to use?
I’ve only really tinkered with Apple Business Essentials at this stage, but it is quite clearly an Apple product. That means a clear, column-based user interface, a clean design that eliminates what is not necessary, and eye-guiding use of icons and categories to help you find what you need.
Collections, for example, are easy to create, populate, and distribute, while more granular tasks, such as user, user group, or device management, are also easy to get to and explore.
But for many, the Settings section will be of particular use.
Apple appears to have put a lot of thought into these, bundling standard items such as passcode policy management and Wi-Fi settings in with increasingly important controls, such as Energy Saver Settings.
While it may not at first be clear why it matters that all devices follow company policy for energy consumption, if you stop to consider the energy costs of running systems for up to 500 employees, the advantage of such controls becomes a lot clearer. And they should probably be seen as vitally important as we lurch into the climate crisis.
What does it cost?
Apple Business Essentials is available as a free beta starting today in the US.
Once it emerges from beta, Apple said the service will cost $2.99 per user/month to handle a single device with 50GB storage, scaling up to $12.99/user/month for up to three devices per user and up to 2TB of storage.
Those prices don’t yet include the cost of AppleCare+ for Business Essentials, and the company is not charging beta users. But it didn’t want charges to surprise anyone once the service launches.
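To put those published numbers in concrete terms, here’s a minimal sketch of the arithmetic. Apple has only confirmed the two endpoint tiers quoted above ($2.99 and $12.99 per user per month); the `monthly_cost` helper and the plan table are hypothetical illustrations, not anything Apple ships.

```python
# Hypothetical sketch of Apple Business Essentials per-user pricing,
# using only the two tiers confirmed in the announcement. Intermediate
# plans exist but their prices aren't specified here, so they're omitted.

PLANS = {
    # (devices per user, storage tier) -> USD per user per month
    (1, "50GB"): 2.99,   # entry plan: one device, 50GB of storage
    (3, "2TB"): 12.99,   # top plan: up to three devices, up to 2TB
}

def monthly_cost(users: int, devices: int, storage: str) -> float:
    """Total monthly cost for a team where every user is on one known plan."""
    price_per_user = PLANS[(devices, storage)]
    return round(users * price_per_user, 2)

# Example: a 500-employee shop (the service's stated ceiling) on the
# top plan would pay 500 * $12.99 per month.
print(monthly_cost(500, 3, "2TB"))  # → 6495.0
```

That ceiling figure is worth noting: even a maxed-out deployment stays well under the cost of a typical third-party MDM seat plus separate support contracts, which is presumably the point.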
How do you join the beta program?
The service will be fully available in the spring of 2022. To join the beta, sign up here.
Hosts Juliet Beauchamp and Ken Mingis talk with guests about the latest tech trends and news.
By rolling out some Android 12 features exclusively to Pixel users, Google gave itself an opportunity to further differentiate its own devices from the rest of the Android pack. New updates, like the Material You interface, give Pixel owners the chance to redesign the look and feel of their phones. Plus, Android 12 and the newly launched Pixel 6 both purport to have privacy-centric changes. So, how does the Pixel Android 12 experience compare to that of a Galaxy user? Computerworld managing editor Val Potter and contributing editor JR Raphael join Juliet to discuss new Android 12 features, including how it performs on the new Pixel 6 and Pixel 6 Pro versus other Android devices.
With COP26 just days away, Apple has joined a new chip technology program as it works to improve its environmental sustainability in product design. It’s a move that reflects what is hopefully a growing understanding that every enterprise must mitigate the environmental consequences of doing business.
An obsession with detail
The Sustainable Semiconductor Technologies and Systems (SSTS) scheme will take a deep look at how future processors are made in order to help reduce the environmental impact of making these chips. It’s a detail-based approach that may benefit from the critical successes Apple’s product designers can sometimes achieve. The program is being put together by the Interuniversity Microelectronics Centre (imec) and aims to “anticipate the environmental impact of choices made at chip technology’s definition phase.”
What that means is the group hopes to develop models to help chip designers reduce the ecological footprint of the processors they create. It’s an overt attempt to align processor development with the fight against climate change.
From design to manufacturing
The challenge isn’t just ensuring that chips themselves are designed with environmental consequences in mind: it’s also about developing better manufacturing processes.
While it’s no secret that processors are small, they are becoming increasingly numerous. The number of devices that use processors is growing exponentially, which means the consequences of manufacturing them are great.
Processor manufacturing is characterized by high energy consumption and makes extensive use of chemicals, rare materials, and water. It also generates a huge amount of greenhouse gas.
Where Apple fits in
Apple’s chip manufacturer, TSMC, uses almost 5% of Taiwan’s entire electricity production. It used 63 million tons of water in 2019, and generated controversy during this year’s drought.
It’s not just Apple, of course: A single Intel factory in Arizona produced more than 8,000 tons of hazardous waste in just three months this year, The Guardian reports. In fact, some say the manufacturing of the processors used in our devices accounts for the majority of the carbon output from electronic devices, with The Guardian citing a Harvard study.
Those powerful M-series Apple chips may have turned Macs into the cream of the PC industry crop, but manufacturing them has consequences. We know Apple takes this stuff seriously, so it shouldn’t be a surprise the company has joined SSTS.
Doing so reflects Apple’s growing understanding that the environmental consequences of product manufacturing must be considered from the beginning of the design process. That’s why, if my sources are correct (which I think they are), Apple’s environment-focused teams now have a big say in new product design.
They explore how designs can ensure that raw materials can be separated, recycled, and reused. They also work to identify where materials replacement could reduce the ecological footprint.
The decisions concerning new product design and manufacturing are also in line with the company’s long-term hope that it may create a circular manufacturing system that eliminates the need to consume further resources.
Towards a green new deal
Apple aims to be completely carbon neutral by 2030 across its supply chain and products. It has convinced 175 of its suppliers to transition to renewable energy, and continues to invest in projects and resources (such as managed forests and wind farms) to help meet these targets.
“Every company should be a part of the fight against climate change, and together with our suppliers and local communities, we’re demonstrating all of the opportunity and equity green innovation can bring,” said Apple CEO Tim Cook recently. “We’re acting with urgency, and we’re acting together. But time is not a renewable resource, and we must act quickly to invest in a greener and more equitable future.”
The Apple leader understands that building this green new deal is an opportunity.
“Climate action can be the foundation for a new era of innovative potential, job creation, and durable economic growth. With our commitment to carbon neutrality, we hope to be a ripple in the pond that creates a much larger change,” he said last year.
Google’s Android 12 software is packed with interesting treasures — but unless you’re using one of Google’s own Pixel phones, it’s still a ways off from actually landing in your hands.
The tortoise-like pace of most Android updates is another subject for another day (as is the tortoise named Rupert who I’m pretty sure is responsible — that slimy-shelled rascal). Today, I want to explore some creative solutions for bringing a small but significant smidgeon of Android 12’s goodness onto any device this minute.
This minute, you say? Why, yes, Mr. Giggles! With a touch of creativity and an optional pinch of platypus magic, you can experience a handful of select Android-12-inspired treats on any Android device, with any Android version — and with very little effort — right now. All you need is the right app and a few minutes of setup, and some scrumptious Android 12 flavors will be ready and waiting for your ingestion.
To be clear, these aren’t the biggest, most earth-shifting changes Android 12 has to offer. When it comes to elements like the Material You system theming engine or the engine-room-level Android 12 privacy enhancements, nothing short of the operating system update itself can deliver the goods. But outside of those foundational features, Android 12 has some delightful experience-enhancing delicacies — and those are the ones you can emulate most anywhere.
So grab the nearest bib, get some grape soda standing by and ready to wash all this deliciousness down, and let’s dig in.
Android 12 feature No. 1: The Privacy Dashboard
One of Android 12’s most prominent surface-level privacy features is a saucy little somethin’ called the Privacy Dashboard. It’s an expanded and enhanced control center for all things privacy on your phone, and it makes it easier than ever to monitor exactly how apps are accessing sensitive types of info and then to reclaim control as you see fit.
Without the aura of Android 12 on your grease-coated screen, a crafty app called — oh, yes — Privacy Dashboard can fill the void. Privacy Dashboard is modeled after its Android 12 namesake and almost identical to that native Android doppelgänger in every outward-facing way.
Not bad, right?
Privacy Dashboard is completely free to use. The app sports some optional in-app donations if you want to support the developer, but as of now, they don’t unlock any extra features or do anything other than make you feel warm, fuzzy, and like a person with roughly three dollars less in their bank account.
As for the related subject of privacy, the app does require a couple forms of advanced system-level access in order to do what it needs to do, but it’s all perfectly justifiable. The developer even explains what exactly the permissions are and why they’re needed on the app’s Play Store page and within the initial setup screens. The code is open source, too, so anyone can peek in and see exactly what it’s up to. And if you’re still worried about how the app might handle your sweet, sticky data, just look at its app info page within your system settings. You’ll see that it doesn’t even have the ability to access the internet, which means there’s no real way it could transfer information off of your device even if it wanted to (which it seems safe to say it doesn’t).
Android 12 feature No. 2: Camera and mic indicators
Continuing with our privacy-empowering theme, another Android 12 element you can grant yourself this second is the Apple-inspired system of visible indicators that pop up anytime an app is accessing your device’s microphone or camera. It’s hopefully something you’ll never need or be surprised to see, but it’s a smart bit of added privacy protection and a worthwhile pinch of extra awareness to have.
And guess what? The best app to handle the task is the same one we just went over. Yes, indeedly: The lovely Privacy Dashboard has you covered.
Just install the app, if you haven’t already, and then tap “App Settings” on its main screen. There, you’ll see the option to ensure the “Privacy Indicators” feature is active along with an “Indicator Customization” feature that goes a step beyond Google’s own Android 12 implementation.
How? Why, I’ll tell ya how, you curious cup of wonder: It allows you to (a) control exactly how each indicator looks and where it shows up on your screen and (b) enable an option to make those indicators tappable, so you can easily open up the associated permission log for more context anytime you see one of ’em.
What’s nice, too, is that by default, the indicators show actual icons that make it easy to see at a glance exactly which sensor is active at any given moment. It’s a much more intuitive interface than the interpretation-required dots you get with the native Android 12 equivalent.
See the camera icon, up there in the upper-right corner of the screen — way up above the beautiful bright beak of my executive assistant, Mr. Clucklesby? That’s it!
Oh, and in addition to the camera and microphone indicators, Privacy Dashboard also includes an on-screen indicator for location access, which Android 12 for some reason does not.
And if you really want to go all out, the app can even alert you with a notification anytime one of those sensors is actively in use.
Android 12 feature No. 3: Smarter auto-rotate
One of Android 12’s snazziest tricks (and one that appears to be a Pixel-exclusive element for now) is a new and improved auto-rotate system. The feature relies on your phone’s front-facing camera to watch which way your lopsided head is tilted at any given moment and then make sure your screen matches that orientation.
You can’t quite bring that full dog and pony show to any phone at this point, but you can give yourself something with a similar sort of purpose — and perhaps an even more reliable result. The answer resides within a spectacular app called, rather fittingly, Rotation Orientation Manager.
Though it may have overlooked the rare opportunity for a triple-rhyming title with Rotation Orientation Station, Rotation Orientation Manager more than makes up for its missed meter with the exceptional screen-enhancing intelligence it adds onto your phone.
Just open the app up, make sure to select “Auto-rotate off” as the general default on its main screen, then tap the three-line menu icon in the upper-right corner and select “Conditions.”
Tap the line labeled “Accessibility” and follow the prompts to allow the app access as an accessibility service — something that sounds scary but is genuinely required in order for it to be able to see what your phone is doing at any given moment and then adjust your rotation accordingly. (The developer goes through this and other permissions and exactly how they’re used on the app’s Play Store page.)
Then look through the options on that screen and the “Apps” tab alongside it. You can carefully control exactly how and when your phone’s screen will and won’t rotate itself to make it work any way you want.
Personally, I like keeping the rotation locked into portrait mode with all but a small and super-specific handful of apps — things like Camera, Photos, Maps, and YouTube, where I actually want to view the screen in its horizontal orientation some of the time. Outside of those titles, it’s pretty rare for me to want the screen to rotate, and Rotation Orientation Manager makes sure it never does.
Android 12 feature No. 4: Smarter search
Android 12 provides Pixel owners with the long-overdue ability to search the web and their phones in a single streamlined spot on the home screen. (Seriously, how funny is it that a Google-made operating system and a Google-made phone didn’t have that manner of search capability up until now?!)
Here’s a little secret, though: You don’t need Android 12 or a Pixel to enjoy that extremely practical power.
An app called Sesame will add the same sort of supercharged search setup onto any Android phone — and just like with some of the other items on this friendly little list of ours, it’ll actually outdo Google’s implementation in some pretty significant ways.
Install Sesame on your phone and go through its setup. Next, slap its search widget onto your home screen, tap it lovingly, and start typing away. As you enter letters, Sesame will return results from your apps, your contacts, and even your calendar events. It indexes specific areas within apps, too, so you could start typing the first couple letters of a Slack team, for instance, and it’d pop up a direct link to take you right to that part of Slack.
And, of course, it’s just one more tap from those results to move into a more traditional web search.
Sesame costs $3.50 after a free two-week trial.
Android 12 feature No. 5: One-handed action
Last but not least in our Android 12 adoption collection is a little somethin’-somethin’ for anyone with a plus-sized screen and a normal-sized hand.
Android 12 at long last introduces a native one-handed mode for Android that lets you shrink the active area of your screen down into a smaller window for more ergonomic access when you need it. But once more, you don’t actually need Android 12 to get that kind of capability.
First and foremost, lots of Android device-makers have been offering their own one-handed optimization systems in their versions of Android for a while now. On Samsung devices, for instance, try searching your system settings for the phrase one hand. That’ll pull up the switch along with instructions to activate a one-handed mode that’s already on your phone and almost shockingly similar to what Google did in Android 12.
If your device doesn’t have such a system built in, though, fret not — and instead go grab an app called Smart Cursor. Then congratulate yourself for having such a shrewd pointing program on your virtual companion.
Smart Cursor takes a decidedly different approach to optimizing a large Android phone for one-handed use, and even if your device does have a built-in one-handed mode feature, you might actually find it to be the superior choice. What it does is give you a way to pull up a floating cursor at the top of your screen that can then access any areas you can’t reach and interact with ’em on your behalf — kinda like one of those claw-reacher thingamajiggers you sometimes see in the real world.
By default, you simply swipe inward from one of the lower areas of your phone’s edges to pull up the cursor. Then you move your finger around to position the cursor, and you tap anywhere on your screen when you want the cursor to click.
Smart Cursor has tons of options for customizing exactly how all of that works. You can shift around the trigger zone, change the size and appearance of the cursor itself, and adjust how long the cursor stays visible after it’s been activated. All in all, it’s a really thoughtful way to extend your reach without resorting to the screen-squashing silliness most native one-handed modes (including Android 12’s) require.
Smart Cursor is free in its base form with an optional $2 upgrade that unlocks some extra advanced options. The app does require the same system accessibility service permission we’ve talked about a couple times now — again, inevitably, given what it needs to do — but it doesn’t have any other alarming permissions or even access to the internet, so there’s no need to worry your hilariously horse-like head.
Android 12 may be out of your grasp for now, my striking steed, but with clever creations like these, you can bring some splendid new features onto your phone while you wait — and feel at least a little less left out than you did before.
Get six full days of advanced Android knowledge with my new Android Shortcut Supercourse. You’ll learn tons of time-saving tricks for your phone — no new software required!
Hexnode, a cross-platform unified endpoint management firm recognized by both Forrester and Gartner as an enterprise mobility solutions provider, has worked with business clients since 2013 to lock things down. I recently spent a little virtual time with Hexnode CEO Apu Pavithran to talk Apple in the enterprise and the future of work.
Pavithran recounted how much change enterprise IT has seen in the past decade, and where things may be going now that Apple is more of an enterprise player.
A decade of change
Think back to 2010: Mobile hardware in the workplace was heavy on BlackBerry, ThinkPads, and a smattering of mobile devices. Apple’s iPhone was certainly the consumer smartphone of choice, but the BYOD wave hadn’t yet hit business.
As consumers quickly embraced smartphones in their day-to-day life, they also began insisting on using them at work. “This movement paved way for Apple’s enterprise evolution,” said Pavithran.
Since then, Apple has paid increasing attention to the needs of enterprise IT: Apple Push Notification Services, Apple VPP, Apple Business Manager, the Fleetsmith acquisition, and critical partnerships with the likes of SAP, IBM, Jamf, Deloitte, and Cisco.
Those efforts have paid off. “Previously trying to manage applications, titles, device settings, program licensing, and federated AD logins were nightmares for the IT department. The introduction of these services, and especially the Apple business manager (ABM), made things easier for IT admins,” said Pavithran.
“Mac adoption in the enterprise saw tremendous growth and iOS devices slowly became the mobile industry tool of choice.”
His comments echo those of Cisco’s general manager and executive vice president Jeetu Patel, who told me recently: “I don’t believe you can be a credible provider of enterprise software if you’re not part of the Apple ecosystem today.”
On the competition
The Hexnode CEO doesn’t think Apple is ahead on everything. He points to some recent Android enhancements as being advantageous, but notes that Cupertino is only slightly behind.
“Apple’s hardware and software are spot on and it does deliver in performance, security, and [reducing] the attack surface within an enterprise. They also do meet the checklist when it comes to managing corporate-owned devices, COBO or Kiosks for that matter. But for a full-fledged BYOD, User Enrolment still falls short of Google’s Work Profile.”
Microsoft, of course, remains in the game, too. From a multi-platform perspective, Pavithran sees the company’s move to offer Windows as a subscription as noteworthy, though it is limited by price.
“From a business standpoint, paying $31/user/month for a basic dual-core remote PC when you can get a sizable business laptop with pre-installed Windows on a lease that costs less than $10/month — the math doesn’t make sense,” he said.
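Pavithran’s arithmetic is easy to sanity-check. Here’s a quick sketch using only the figures he cites (real pricing varies by plan, region, and term length):

```python
# Rough multi-year cost comparison behind Pavithran's point.
# Both monthly figures are the ones quoted above, not actual price lists.
CLOUD_PC_MONTHLY = 31.0       # basic dual-core Windows 365 Cloud PC, per user
LEASED_LAPTOP_MONTHLY = 10.0  # leased business laptop with Windows pre-installed

def three_year_cost(monthly: float, months: int = 36) -> float:
    """Total spend over a typical three-year hardware term."""
    return monthly * months

cloud_total = three_year_cost(CLOUD_PC_MONTHLY)        # 1116.0
laptop_total = three_year_cost(LEASED_LAPTOP_MONTHLY)  # 360.0
print(f"Cloud PC over 3 years: ${cloud_total:,.0f}")
print(f"Leased laptop over 3 years: ${laptop_total:,.0f}")
```

Over a typical three-year term, the quoted Cloud PC plan costs roughly three times the leased laptop; that gap is the mismatch he’s pointing at.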
On employee choice
As the Great Resignation intensifies, businesses are transforming the workplace to tempt and retain good staffers. Hybrid work is rapidly becoming a must-have option, and employee choice remains critical. “When employees are comfortable with the system, then their work gets easier,” Pavithran said.
As a result, it makes sense for IT to “automate everything that can be automated so people can spend time on tasks that actually matter.”
Reflecting Apple’s argument around consumer satisfaction and brand appeal, he said: “[Employees] are happier when you provide them with a Mac or an iPhone. One recent U.S. survey showed that 71% of people prefer Mac over any other for their work PC. It’s true that Apple devices cost a bit more initially, but in three to five years they offer better value for money than a regular PC as they offer regular OS and security updates. Bundled with AppleCare for maintenance and pretty good buyback offers, it is lighter on the wallet.”
Incoming technologies and emerging challenges
Apple and Facebook/Meta will be battling in augmented reality in the next few years. Is there a real enterprise opportunity to be had there? Hexnode’s boss thinks so. “AR in the enterprise could be absolutely amazing in the next five years,” he said.
Retailers are already using the tech, though Pavithran voices some concerns – which will probably be echoed by every enterprise: “Although on AR glasses, it is indeed fun and there is a ton of functionality, …I am still not sure how much privacy and data security they will offer.”
Meanwhile, the downside of the move to hybrid workplaces includes the rapid proliferation of threats to security. “Phishing/ransomware incidents are rising at an alarming rate. If we look at cybersecurity news on a regular basis, there is at least one major cyberattack each month,” he said.
In the remote/hybrid age, with people working in living rooms or from anywhere they please, the risk becomes more serious. “Enterprise security often gets overlooked and it is crucial for businesses to take necessary measures to safeguard their data.”
How to secure your endpoints
Given Hexnode’s business is around endpoint security and management, it’s not a big surprise its CEO evangelizes solutions of that kind. But the biggest challenge remains the hardest to solve: humans are the weakest link in tech security. “Some general suggestions would be to avoid using public Wi-Fi and Bluetooth services since threat actors can exploit vulnerabilities,” he said.
Pavithran also says it’s good practice to avoid using personal devices for work, though data separation may help reduce the risks of bleed between corporate and personal digital data space.
From zero to hero: Apple in the enterprise
Hexnode’s clients include Volvo, Lowe’s, Target, Swatch, and others, and the company says it can support Macs, iPhones, iPads, and Apple TV alongside Windows and Android systems — all using the same management tools.
That flexibility reflects just how rapidly Apple continues to grow in the enterprise technology space. In the last decade, it has made a major transition: no longer an also-ran in enterprise tech, it has become a respected peer. At a time when tech itself – and the nature of the workplace – is engaged in rapid transition, that means big opportunities for those who surf the wave.
Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
Corel Painter, one of the world’s leading creative applications, runs twice as fast on M1 Macs as on Intel-based machines, which hints at even better performance on the latest M1 Max and M1 Pro MacBook Pros.
Painter’s big Mac picture
That’s according to Corel, which told me today that introducing a fully Apple Silicon-native version of Painter 2022 unlocked “significant” performance gains. These are particularly noteworthy when you consider what the application does – since the early ‘90s it has striven to be the digital equivalent of artistic tools; it’s already in most digital creatives’ tool bags.
Corel said artists upgrading to Painter 2022 on M-series Macs will see “significant brush engine performance improvements, and we’re excited about what these boosts will mean for artists who work and live by deadlines.”
Corel, which tested its software on an entry-level 2020 MacBook Pro with an M1 chip and an Intel-based 2019 MacBook Pro, highlighted two key metrics:
Users can expect overall brush engine performance that is up to 4.7x faster on Macs with M1 chips compared to running on the same Mac using Rosetta 2.
Users can also expect the software to run twice as fast when compared to Intel-based Macs.
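Corel’s two multipliers share a common reference point, so they can be combined. A small sketch, treating the quoted “up to” figures as exact (real workloads will vary):

```python
# Corel's two claims, normalized to the Intel MacBook Pro as 1.0x.
# (Multipliers are the cited marketing figures, not measurements.)
intel_baseline = 1.0
native_on_m1 = 2.0 * intel_baseline  # "twice as fast" as the Intel Mac
rosetta_on_m1 = native_on_m1 / 4.7   # native is "up to 4.7x" Rosetta 2

print(f"Native on M1:  {native_on_m1:.2f}x Intel")
print(f"Rosetta on M1: {rosetta_on_m1:.2f}x Intel")
```

In other words, the cited numbers imply the Rosetta 2 build would run at only about 0.43x the Intel machine’s speed, which is why the fully native release matters so much.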
That’s significant, but potentially much more so for digital creatives considering an investment in an all-new MacBook Pro. Apple has told us what to expect from those machines, which are equipped with M1 Pro and M1 Max processors.
With a 30-year history on Mac, Painter is one of those unique applications to have gone through three major processor transitions with Apple. It was on the original AIM Alliance Macs, made it to the Intel Macs, and now it’s running on M-series Macs. That shared story means you can consider it a barometer for Mac performance.
(I’m lucky enough to be working with a new 16-in. MacBook Pro at present, and while I’m not ready to write the review, I can say the battery life and computational performance it is delivering are phenomenal.)
Performance machine, Apple’s ‘big beast’
We’ve also seen some additional benchmark figures leak that suggest the kinds of speed and performance benefits Apple professionals can expect from these machines.
Most recently, we saw PassMark place Apple’s M1, M1 Pro, and M1 Max chips in the top four positions of its performance benchmark charts. Prior to this, Geekbench data revealed the big advantages of these new chips, which compete with high-end gaming PCs but require far less energy and deliver across a far wider gamut of needs.
Apple’s silicon development teams appear to have figured out how to make those billions of transistors work for you.
Corel’s claims aren’t surprising, of course. Apple made its own sets of similar claims when it introduced the new Macs, claiming the 16-core GPU in M1 Pro and 32-core GPU in M1 Max (in the 14-inch MacBook Pro) provide:
Up to 9.2x faster 4K render in Final Cut Pro with M1 Pro, and up to 13.4x faster with M1 Max.
Up to 5.6x faster combined vector and raster GPU performance in Affinity Photo with M1 Pro, and up to 8.5x faster with M1 Max.
Up to 3.6x faster effect render in Blackmagic Design DaVinci Resolve Studio with M1 Pro, and up to 5x faster with M1 Max.
The 16-in. models deliver even higher performance, including up to 1.7x faster Final Cut Pro rendering and up to 4.9x faster object tracking in the 16-core chip.
Apple also cited battery life enhancements, which means third-party apps should work faster and last longer on one charge.
More improvements in the pipeline
We already know Adobe Creative Cloud applications run twice as fast on M1 Macs, and that performance achievement will only increase on the new M-series chips in the MacBook Pro. With Apple reportedly plotting a path to introduce the new processors in a 2022 iMac Pro, and with anticipation around the subsequent release of an all-new Mac Pro — perhaps with multiple processors — this seems to be a really good time to be a creative pro using a Mac.
I’d be interested to learn more about the level of computing performance other key application developers are experiencing as they migrate to Apple Silicon – particularly on Apple’s latest MacBook Pro. Anyone for whom time is money will certainly want to look into this.
Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
For years now, we’ve been moving from a PC-centric IT world to Desktop-as-a-Service (DaaS) models. The shift has been driven by the rise of working from home, companies wanting to secure their end users, and the convenience for IT of centrally managed user endpoints. There was only one hitch: A handful of popular programs were only available on PCs — chief among them, Adobe Photoshop.
Now, finally, Adobe is bringing not just Photoshop but Illustrator to Software-as-a-Service (SaaS) users. The company unveiled its plans last week at Adobe MAX 2021.
You might think, “What’s the big deal? Hasn’t Adobe been releasing its programs as SaaS for years as Creative Cloud?”
Actually, no. Adobe Creative Cloud is neither a cloud nor a SaaS; it’s a software rental licensing business model. True, you can share files using its infrastructure-as-a-service (IaaS) storage, but you could always do that with network file-sharing or third-party cloud services such as Dropbox.
So, if you thought Photoshop’s been available in the cloud all along, you thought wrong. This is more a case of cloud-washing: slapping a cloud coating on essentially the same old program.
To use Creative Cloud Photoshop, or Creative Cloud anything, you had to download a fat client. Despite the name, it’s not a SaaS offering. Today, if you want to run Creative Cloud Photoshop, you need a high-powered Windows PC or Mac.
Soon, that will be another story.
With the new Photoshop and Illustrator betas, users can share projects with anyone with a web browser. They can view and comment on your Photoshop or Illustrator files. And Creative Cloud subscribers can make light edits to those same files from their browsers.
This is only the beginning. As Adobe explains: “The journey of bringing Creative Cloud to the web starts now with Photoshop on the Web in public beta, while Illustrator on the Web is debuting in private beta.” Eventually, most Creative Cloud programs, such as InDesign, Acrobat DC, and Fresco will be available as SaaS programs. And even video-editing tools such as After Effects and Premiere Pro will be available on DaaS setups.
In the meantime, Adobe is enabling real cloud use with its new Creative Cloud Spaces and Creative Cloud Canvas. Spaces is a shared repository of files for your team members. Canvas lets you show your images and documents in a single visual space. (Think of it as a virtual bulletin board and you wouldn’t be far off the mark.)
Put this all together and soon you’ll be able to run the Creative Cloud software suite on a Chromebook, with Windows 365 Cloud PC, or some other DaaS program.
Of course, if you’re an Adobe Photoshop pro, you’re still going to want a high-powered Mac or PC. When it comes to image and video manipulation, there’s no such thing as enough CPU and GPU horsepower. Officially, Adobe recommends GPUs with an average Ops/Sec benchmark of 2,000 or higher on PassMark’s GPU Compute Benchmark Chart. My friends who make their living from Photoshop won’t even turn on a workstation unless its graphics card has a PassMark Direct Compute score above 10,000.
But how many Photoshop users really need that kind of power? If all you’re doing is rendering images and doing simple edits such as cropping, navigating layers, adjusting exposure, and annotating images, the forthcoming pure cloud Photoshop program is all you’ll need. Adobe calls its new web-based tools sufficient for “light editing.” That’s exactly right.
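The benchmark bars in the last two paragraphs amount to a rough triage of Photoshop users. Here’s a toy sketch of that triage, where the thresholds are the figures cited above and the function and its labels are purely illustrative:

```python
# Two GPU benchmark bars cited above: Adobe's official recommendation
# and the informal score working pros hold out for. Labels are mine.
ADOBE_MIN_SCORE = 2_000  # PassMark GPU Compute "Ops/Sec" recommendation
PRO_MIN_SCORE = 10_000   # PassMark Direct Compute bar cited by pros

def photoshop_tier(gpu_score: int) -> str:
    """Map a benchmark score to a rough usage tier."""
    if gpu_score >= PRO_MIN_SCORE:
        return "pro workstation territory"
    if gpu_score >= ADOBE_MIN_SCORE:
        return "meets Adobe's recommendation"
    return "stick to light editing (or the web beta)"

print(photoshop_tier(12_500))  # pro workstation territory
print(photoshop_tier(1_200))   # stick to light editing (or the web beta)
```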
True, power users will always need high-end hardware, PC-centric operating systems, and programs and files stored locally on their hardware. But with the Adobe family coming to DaaS, most users will do just fine with SaaS and cloud-based operating systems.
This move isn’t just good for Photoshop users. It puts the Adobe stamp of approval on the switch to DaaS as well. I’ve said it before, and I’ll say it again: IT is on its way to a DaaS world.
Android 12 may seem like old news to those of us in the land o’ Pixels at this point, but hold the phone: Google’s latest software has some pretty phenomenal features that are lurking beneath the surface and all too easy to overlook.
We explored a dozen such treasures the other day, but there’s even more juicy goodness where that came from. So here now are seven more spectacular hidden gems you’ll absolutely want to dig up in Android 12 on your Pixel phone — regardless of whether you’re packin’ the new Pixel 6 or Pixel 6 Pro or one of the older Pixel models.
Check ’em out, get yourself in the habit of using ’em, and then come sign up for my free Pixel Academy e-course to uncover even more hidden Pixel magic.
New Pixel trick No. 1: Fast link-grabbing
Android’s Overview area — y’know, the card-driven app-switching interface you see when you swipe up from the bottom of your screen and then hold your finger down, using the current Android gestures system — has gotten some seriously cool superpowers on Pixels as of late. And with Android 12, your favorite Googley phone has another tucked-away time-saver to discover there.
So here it is: Anytime you look at Chrome in your Overview area, you can grab the link of the currently open page with one quick tap — without ever leaving Overview or opening the app. And from there, it’s just one more press of your precious fingie to copy the link or share it anywhere else on your phone.
How to find it
This one’s easy:
Open the Overview interface by swiping up from the bottom of your screen about an inch and then stopping — or, if you’re still using the old legacy three-button nav system, tap the square-shaped button on your phone’s lower edge instead.
Find Chrome in your list of recent apps.
Look for the curiously shaped link icon in the thumbnail’s upper-right corner.
Tap that icon, and shazam! You’ve got your link.
New Pixel trick No. 2: Quick device controls
One of Android’s best buried treasures is the device control panel introduced in Android 11 and then weirdly tucked away to an out-of-the-way place in Android 12. Fear not, though, for your Pixel’s fancy new software has a couple cool ways to bring that panel back to the forefront.
The panel, in case you aren’t familiar, gives you easy access to adjusting any connected devices associated with your account — smart lights, cameras, thermostats, speakers and displays, e-rodents, you name it. It’s an awesome time-saver, but it’s up to you to pull it out of Android’s bowels and make it accessible.
How to find it
First things first, you need to download the confusingly named Google Home app, if you haven’t already, and make sure you’re signed into it and set up with any connected devices you’ve got. And don’t let the app’s name fool you: It’s essentially just an interface for interacting with any and all connected gadgets, no matter where they are or what purpose they serve (ahem).
Once you’ve got that up and running:
Swipe down twice from the top of your screen and tap the pencil-shaped icon in the lower-left corner of the Quick Settings panel.
Scroll down until you see the “Device controls” tile. It may well be at the very bottom of the list, way down in the inactive tile area.
Touch and hold your finger to that tile and drag it all the way up. I’d suggest putting it into one of the top four positions, which will make sure you can always see and access it above your notifications with just a single swipe down from the top of your screen.
And there’s another new Android-12-added Pixel feature along those same lines…
New Pixel trick No. 3: A smarter lock screen
In addition to showing basic info like the time, weather, and any pending notifications, your Pixel’s lock screen can offer up a one-tap shortcut to that connected device control panel. In fact, the shortcut might already be there, but there’s a decent chance you haven’t noticed it or realized what it does.
How to find it
Once you have the Google Home app installed and configured, look for a subtle house-shaped icon in the lower-left corner of your Pixel’s lock screen:
See it? Purty, right? Tap that house and tap it good, and you’ll be staring at your connected device controls faster than you can say “Hey Google, house it goin’?” (as one does).
If you have the Home app set up but don’t see that icon on your lock screen:
Panic and scream loudly. (Just kidding. Don’t actually do that.)
Open up your system settings and head into the Display section.
Tap the line labeled “Lock screen.”
Look for the “Show device controls” toggle and make sure it’s in the on position.
While you’re there, take note: You can also turn on a toggle in that same area to show an icon for your wallet on your lock screen and keep your Google Pay card selector one tap away. You’ll need to have the Google Pay app installed and configured in order for that one to work.
New Pixel trick No. 4: The kill switches
Among its many under-the-hood privacy enhancements, Android 12 introduces a pair of new kill switches that make it super-easy for you to disable your Pixel’s camera or microphone anytime the need arises. They also have the advantage of allowing you to talk about using “kill switches,” which makes you sound totally hard-core…until people figure out that you’re actually talking about an advanced privacy feature on your Google Pixel phone.
But, hey, I won’t tell if you don’t. So let’s find and enable those options, shall we?
How to find it
We’ll need to mosey our way back into your Pixel’s Quick Settings editor to dig these lovely fellas up:
Once again, swipe down twice from the top of your phone’s screen and tap the pencil-shaped icon in the lower-left corner of the Quick Settings panel.
Scroll all the way down to the inactive tile area at the bottom of the list and look for the tile labeled “Mic access.”
Press and hold your finger to that tile and drag it up into the active area of your Quick Settings — in whatever position you like. (Just remember that the first four tiles are the ones you’ll see with a single swipe down from the top of the screen, while the first eight are the tiles you’ll see with two swipes down. Anything beyond that will require an additional swipe to the side to access.)
Repeat that same process with the “Camera access” tile.
Whoop vigorously in celebration.
After that, it’ll just be a single tap on either of those tiles to turn off the associated function and know no one can possibly see and/or hear whatever it is you’re up to (you naughty, naughty bird).
New Pixel trick No. 5: Color coordination
Android 12’s Material You theming system is surprisingly impressive, but for some reason, one of its most prominent elements seems to be disabled by default on Pixels that are upgrading to Android 12.
That element is the auto-theming of icons on your home screen so that they’re coordinated with your current wallpaper and the rest of your phone’s dynamically changing motif. It’s a subtle touch and a purely superficial one, but it really does create a nice effect and make a meaningful difference in how pleasant your phone is to use.
How to find it
Provided you’re using your Pixel’s default Pixel Launcher home screen setup:
Press and hold on any blank space on your phone’s home screen.
Tap “Wallpaper & style” in the menu that comes up.
Scroll down on the next screen you see until you find the “Themed icons” option.
Flip its toggle into the on position.
Pretty spiffy, wouldn’t ya say? Just note that as of now, it’s mostly only Google apps that are affected by this system. It’ll take some time for non-Google app developers to get on board with the system and start empowering their apps to be a part of it — but with any luck, we’ll see more of that soon.
New Pixel trick No. 6: A simpler Assistant shortcut
Google Assistant’s got tons of worthwhile time-savers, with new options showing up all the time (especially here in Pixel Land!). But uttering “Hey Google” isn’t always the most practical way to summon your trusty virtual companion. And Android’s swipe-up-from-the-corner-of-your-screen gesture for activating Assistant is often awkward and inconsistent (which might be at least in part why Google disabled it by default on the new Pixel 6 devices).
Well, take note: As of Android 12 on a Pixel, you’ve got another option — and it’s about as easy as can be: A simple press and hold of your phone’s physical power button can pull up your friendly neighborhood genie and have it standing by to help with whatever you need.
B-b-b-b-b-but — oh, yes — you’ve gotta find and activate the feature first.
How to find it
Thankfully, there’s not much to this process:
Open up the System section of your Pixel system settings.
Look for the line labeled “Press and hold power button,” and tap it.
Activate the toggle next to “Hold for Assistant.”
That’s it: Now just press and hold that protruding ol’ power button of yours, and your Assistant will be there in a flash. When you want to power down your phone or access any other items in the traditional Pixel power menu, press the power button and the volume-up key together (and that’ll override the previously available muting shortcut connected to that same key combination).
New Pixel trick No. 7: Tap action
If you love shortcuts as much as I do, you’re gonna adore this final Pixel-specific Android 12 treat. It’s a new system that lets you tap twice on the back of your phone to have a specific action performed — capturing a screenshot, summoning Assistant, playing or pausing any active media, opening your recent apps, opening your notifications, or opening any app you want (whew!). As an added bonus, it also makes a lovely percussive pitter-patter.
So, yeah: This new Pixel option is pretty forkin’ fantastic, to say the least. The only catch is that it seems to be available only on the Pixel 4a 5G and higher — so the Pixel 4a 5G, Pixel 5, Pixel 5a, and then the new Pixel 6 and Pixel 6 Pro models (but not, curiously enough, the regular Pixel 4 or Pixel 4a devices).
How to find it
If you have the Pixel 4a 5G or higher:
Go into the System section of your phone’s settings.
Tap the line labeled “Quick Tap.”
Flip the toggle at the top of the next screen into the on position.
Select which action you want your taps to trigger.
If you go with the “Open app” option, be sure to hit the gear-shaped icon along that same line to select which app, specifically, you want to have open.
And if you find your phone is activating the tap action inadvertently, when you aren’t actually tap-a-tap tappin’, head back into that same area of the settings and try activating the “Require stronger taps” toggle to see if that helps.
Not a bad bag of tricks to have, eh? And hey, don’t forget to come join my free Pixel Academy e-course to keep the momentum going. You’ll get seven spectacular days of efficiency-enhancing Pixel knowledge — starting with some camera-centric smarts and moving from there to advanced image magic, next-level nuisance reducers, and oodles of other opportunities for pro-level Pixel intelligence.
Your Pixel productivity journey is just getting started, my dear Pipsie, and we’ve got plenty more ground left to cover yet.
One can make the argument that Apple created the phenomenon of shadow IT when it introduced the iPhone and the App Store. Suddenly managers and individual users had the ability to source their own business software and services, bypassing IT departments completely. And they could do so with devices not connected to a corporate network, preventing IT from even realizing shadow IT was happening in their organizations.
Apple did step in a couple of years later, providing an enterprise mobile device management (MDM) platform that allowed IT some control over devices in their organization. But to be effective, IT still needs to partner with line of business managers and individual users. After all, users can simply use devices not enrolled in MDM if they choose.
Fast forward a decade from the introduction of MDM, and Apple is again creating a potential shadow IT nightmare in the form of iCloud Private Relay.
What is iCloud Private Relay?
iCloud Private Relay is a new privacy feature in iOS 15 (available today but still in beta) for users with paid iCloud accounts, now known as iCloud+ accounts. And it is generally a good consumer privacy protection system.
Available on iPads and iPhones since iOS 13, Shortcuts are now available in macOS Monterey. These automations are designed to simplify repetitive tasks, but do you know how to use them?
Get familiar with the Shortcuts app
If you’ve used Shortcuts before, the user interface should seem familiar — particularly if you’ve used them on your iPad. The application window uses Apple’s now customary left-hand side bar with buttons to take you to Gallery, All Shortcuts, Quick Actions, Menu Bar, and a Folders section.
All Shortcuts combines all the Shortcuts you may already have created on an iOS device, along with a very short collection of Starter Shortcuts. These starting points are selected on your behalf by Siri and reflect what it has learned about how you use your devices.
That’s useful, but what is shown may not reflect the tasks you want to get done on your Mac – even Apple’s powerful new M1 Pro and M1 Max Macs, which arrive today. That Shazam shortcut makes more sense on your iPhone than it does on a new M1 iMac, for example; to get to the productivity enhancement tools, you’ll need to click Gallery.
Gallery provides a wide array of pre-built shortcuts organized into selected groups, including Siri-related shortcuts and shortcuts to get things done, and a variety of essentials, including accessibility. You can combine one or more of these pre-made Shortcuts to make new ones, or combine any of the many single actions supported by the application to create completely new shortcuts of your own.
How to build a Shortcut
Say you want to create a Shortcut to automatically open two specific apps to run alongside each other in Split Screen mode:
In the Search Gallery input at the top right of the Shortcuts app, type ‘Split Screen 2 Apps’ until the relevant Shortcut appears.
Tap this to get to the description page.
You’ll find that this Shortcut lets you set up two apps to work side-by-side, and works with Apple Watch. That means you can ask Siri on your watch to open Safari and Mail alongside each other (once you define those apps).
Click Add Shortcut to customize it — in this case to choose the two apps you want to use in Split Screen.
Once you’ve set up the Shortcut, you can add it to the Menu bar by dragging it from the Gallery view into the Menu Bar folder in the sidebar of the application.
You can also double-click the Shortcut to edit it, then click its Settings icon and choose Keep in Menu bar.
Creating Quick Actions
You can create Quick Actions that are accessible from anywhere in the Finder. Open the Shortcut, click its Settings icon, choose “Use as Quick Action,” and then select Finder, the Services menu, or both.
The Settings pane also lets you select Add Keyboard Shortcut. When you enable this, the item becomes a Quick Action and is also made available as a Service. You then record the keystroke you want to use to run that Shortcut.
How to run a Shortcut
You can run a Shortcut in multiple ways, though you may need to enable the Shortcut to appear in some of these (as a Quick Action, Menu item, etc.):
Click it to run it in the Shortcuts app.
You can pin some to your Menu bar, where a Shortcuts icon offers a drop-down list of all the Shortcuts you’ve made available there.
You can ask Siri to run a shortcut, so long as you’ve named it and enabled it.
You can keep the Shortcuts app in your Dock. When you right-click that item in your Dock, you may find the shortcut you need in Open Recent or Run Shortcut.
You can also right-click an item (an image, for example) to run any available Shortcut via the contextual menu; you need to make the Shortcut a Finder Quick Action first.
You can run them via a keyboard shortcut (see above).
You can also add a Shortcut to run as a Service in the application menu.
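All of the routes above go through the GUI, but Monterey also ships a `shortcuts` command-line tool, so you can trigger the same automations from Terminal or from your own scripts. Here’s a minimal sketch; the shortcut name is the hypothetical one built earlier, and the guard keeps the script harmless on a machine without the CLI:

```shell
#!/bin/sh
# macOS Monterey includes a `shortcuts` CLI; `shortcuts help` lists its verbs.
# Guard first so the script degrades gracefully where the tool is absent.
if command -v shortcuts >/dev/null 2>&1; then
  shortcuts list                        # print every available shortcut by name
  shortcuts run "Split Screen 2 Apps"   # run one, just as asking Siri would
  status="ran"
else
  status="cli-missing"
fi
echo "$status"
```

Because the shortcut name is the only argument, this is an easy thing to wire into cron jobs or other automation that predates Shortcuts.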
What about third-party apps?
Some developers are already offering Shortcuts for their apps. One good example is Pixelmator Pro 2.2, which introduced 28 of its own Shortcuts actions with its recent macOS Monterey update. These include color adjustments, auto-cropping, rotate and resize, mask image, ML Match Colors, ML Super Resolution (an amazing feature, by the way), and file-format conversion shortcuts.
Pixelmator has also put together a couple of new Shortcuts-exclusive features, including automatic background removal for photos of people. It means you can simply Control-click an image, apply the shortcut, and your Mac will do the rest.
What about my old workflows?
If you’ve already invested in Automator or AppleScript, you may be wondering what will happen to that work. Apple says Shortcuts is the future of automation on its platforms, but it has also made it possible to import Automator workflows into Shortcuts. The company also says AppleScript and Automator will continue to be supported on the Mac for some time yet.
To import Automator workflows into Shortcuts, either right-click the workflow in the Finder and open it in Shortcuts, or drag the item into the Shortcuts app. You may find that some workflows won’t run because they rely on routines Shortcuts doesn’t yet possess, though this may change over time.
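If you have a folder full of old workflows, that drag-and-drop import can also be batched with the stock macOS `open -a` launcher. A sketch, assuming a hypothetical `~/Documents/Workflows` folder (the loop simply skips if nothing matches):

```shell
#!/bin/sh
# Hand each saved Automator workflow to the Shortcuts app for import.
# `open -a` is the standard macOS app launcher; the folder path is an assumption.
count=0
for wf in "$HOME/Documents/Workflows"/*.workflow; do
  [ -e "$wf" ] || continue    # glob matched nothing; skip cleanly
  open -a Shortcuts "$wf"     # Shortcuts opens and offers to import it
  count=$((count + 1))
done
echo "handed off $count workflow(s)"
```

Each workflow still lands in the Shortcuts import dialog, so you confirm them one by one rather than converting blind.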
AppleScript, JavaScript, and shell scripts are supported in Shortcuts; just paste the script into the Shortcut actions provided for running those scripts. You will also need to open the Advanced tab in the Shortcuts app’s preferences to allow scripts to run.
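If you’re new to the shell-script action, here’s the kind of snippet you might paste into it — a hypothetical sketch that turns whatever text the shortcut passes in into a filename-safe slug (this assumes you leave the action’s input option set to pass the shortcut’s input on stdin):

```shell
#!/bin/sh
# Hypothetical body for Shortcuts' "Run Shell Script" action: lowercase the
# incoming text and replace spaces with hyphens to make a safe filename slug.
slugify() {
  printf '%s' "$1" | tr '[:upper:]' '[:lower:]' | tr ' ' '-'
}

# Inside the action you would read the shortcut's input: slugify "$(cat)"
slugify "Quarterly Report 2021"   # prints "quarterly-report-2021"
```

The action’s output then flows into the next step of the shortcut, such as a Rename File or Save File action.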
Apple offers extensive information on running Shortcuts on a Mac. But, for me, the deepest look into how they work is always available at MacStories. They love this stuff.
Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.