Nintendo’s ‘Kirby Air Riders’ for Switch 2 Gets ‘Rare’ Accessibility Menu, Report Says
Quinton O’Connor at The Gamer reported last week that Nintendo has built “a plethora of detailed accessibility options” into its Kirby Air Riders title for the Switch 2. The game was shown during a Nintendo Direct presentation featuring director Masahiro Sakurai.
“Nintendo’s not always on the ball about this [accessibility] stuff, and while many of us won’t tap into these features, some absolutely will—and in certain cases, they simply can’t play the game without them,” O’Connor wrote last Friday. “It’s worth highlighting just how much attention accessibility has gotten during development.”
O’Connor’s story includes a social media post from Steve Saylor, who wrote on Bluesky he was “pretty impressed” that Kirby Air Riders has its own Accessibility submenu—which he noted was “rare” for Nintendo to include—featuring options such as button remapping, text size adjustment, and more. In addition, Kirby Air Riders has a “camera shake” toggle under the aforementioned Accessibility menu, which Sakurai encouraged players to “please try out… to find something that works for you.”
As someone who’s decidedly not a hardcore gamer—my favorites are sports titles—I admittedly was a little surprised to learn of Nintendo’s lackluster reputation in the accessibility arena. Back in April, I wrote about being intrigued by the Switch 2; in my piece, I noted Nintendo is involved with the Entertainment Software Association’s (ESA) new Accessible Games Initiative. Moreover, I interviewed ESA senior vice president Aubrey Quinn about the development of the Accessible Games Initiative in April as well.
Apple, NBCUniversal Announce ‘First-of-Its-Kind’ Apple TV × Peacock Subscription Bundle
Earlier this month, Apple and NBCUniversal put out a press release wherein the two companies announced a new bundle that couples Apple TV with NBC’s Peacock.
The bundle became available this week, on October 20.
“Customers in the U.S. can save over 30 percent by subscribing to the Apple TV and Peacock Premium bundle for $14.99 per month, or Apple TV and Peacock Premium Plus for $19.99 per month, through either app or website,” Apple said of its newfound subscription bundle in the announcement. “Apple One subscribers on the Family and Premier plans can subscribe to Peacock Premium Plus and receive a 35 percent discount—the first benefit of its kind for Apple’s all-in-one subscription bundle.”
As a subscriber to both Apple TV and Peacock—as well as Apple One—I was excited to learn this news. My affinity for Apple TV is well-documented, but I’m also a fan of Peacock. Indeed, I signed up for the deal via the Peacock app on tvOS, and the experience was great.
I’m planning to start watching The Paper soon, and I’m really excited that the NBA is back on NBC, in addition to watching Sunday Night Football every week.

From an accessibility perspective, the Apple-NBCUniversal bundle is notable insofar as it lowers barriers to entry for both streaming services. Whatever discretionary income a disabled person has to play with, someone who, like me, loves both Apple TV and Peacock now qualifies for a substantial discount on the monthly cost. What’s more, bundling the two means a person needn’t worry cognitively about managing separate subscriptions. In other words, the salient point is that Apple TV and Peacock are not only made more financially feasible; existing as a bundle makes managing the services more accessible too. These are ostensibly mundane points, but therein lies the rub: the tiniest details can make the biggest difference for someone with disabilities. As I always say, accessibility lies everywhere in the lives of disabled people.
News of the Apple TV-Peacock deal comes soon after Apple dropped the “+” from Apple TV+.
Apple TV’s ‘CODA’ Gets Blu-ray Release Date
Ryan Christoffel reported for 9to5Mac earlier this week that Apple’s CODA is going from streaming to physical media, as the film soon will be available on Blu-ray and DVD.
Pre-orders are happening now ahead of the disc’s release on November 4.
“This [CODA Blu-ray] release seems to be part of a growing trend of streamers being more willing to offer physical movies for purchase,” Christoffel wrote on Wednesday. “Despite previously favoring streaming-only availability, Apple now appears more open to offering Blu-ray versions of its movies.”
As Christoffel notes, CODA has tremendous historical significance: its 2022 Best Picture win—coupled with Troy Kotsur’s Best Supporting Actor victory—marked a pair of firsts, with CODA becoming the first streaming-only film to win Hollywood’s most prestigious award and Kotsur the first Deaf actor to win an Oscar. As a literal CODA myself, the movie holds a special place in my heart, not merely for the subject matter, but for the representational gains it affords CODAs and the Deaf community.
While I, like most people, am a huge proponent of streaming content, I have begun to amass a small collection of my favorite movies on 4K Blu-ray to go along with my dozens of DVDs from two decades ago. In fact, just yesterday I took delivery of Conclave, about a fictitious papal election, from Amazon, and plan to pre-order CODA soon too. (I also have Season 1 of Severance, in keeping with the Apple TV theme.) I’m pretty picky about which Blu-rays I collect; space is a precious commodity at my house, so I’m hellbent on getting only the films I cherish most. From a technological perspective, I’m also motivated by the fact that, as somewhat of a home theater nerd, I know the best picture quality comes from physical media rather than streaming, and I’d like to take full advantage of my spectacular 77” LG C3 OLED in every way possible. I have the critically acclaimed Panasonic UB820 Blu-ray player hooked up to said TV. For those times when I want to have a movie night with, say, CODA, I can pop the disc into the device and enjoy (again).
I should mention too that there is indeed a pro-accessibility argument for streaming over Blu-ray. First and foremost, there are the costs associated with a good Blu-ray player itself, let alone the discs. More to the point, however, is the physicality of physical media. To wit, pressing a button on one’s Siri Remote may be far more accessible than opening the case, taking out the disc, and putting it into the player without scratching it. For many people with certain motor disabilities, this workflow may make Blu-ray a nonstarter. It’s a valid concern, and illustrative of the greater societal shift toward streaming music and movies for convenience’s sake.
News of CODA on Blu-ray comes not long after Apple announced earlier this month that F1: The Movie starts streaming in December. That film is also now available on Blu-ray.
Yours Truly Guests on the ‘Access All’ Podcast
Late last week, I received an email from a producer at BBC News who asked if I’d be interested in coming on the network’s podcast about disability, called Access All, to discuss my experiences using Waymo for my transportive needs in San Francisco. I happily agreed to come on, and we recorded the episode on Monday. To my surprise and delight, it posted earlier today for everyone to listen to in their favorite podcast app.
Besides yours truly, the episode features my friend Amanda Ventura at Waymo, as well as Emma Vogelmann. Vogelmann is co-CEO at UK-based organization Transport for All.
My cameo is timely, as it coincides with Waymo’s announcement that it will bring its autonomous vehicles across the pond beginning in 2026. In a statement accompanying the news, Waymo co-CEO Tekedra Mawakana—whom I’ve interviewed in the past—said the company is “thrilled” to soon bring Waymo to Londoners, adding its service is “making roads safer and transportation more accessible where we operate [and] we’ve demonstrated how to responsibly scale fully autonomous ride-hailing, and we can’t wait to expand the benefits of our technology to the United Kingdom.”
iOS 26.1 Beta 4 Adds Liquid Glass Appearance Toggle
My friend Stephen Hackett reported earlier this week Apple has, in iOS 26.1 Beta 4, added a new toggle for Liquid Glass which allows users to choose between two looks: Clear and Tinted. As Hackett writes, the former represents Liquid Glass in its default state, whereas the latter increases contrast and opacity levels. The setting is found in Settings > Display and Brightness on iOS, and System Settings > Appearance on macOS.
Hackett notes Apple’s placement of the Liquid Glass control is surprising, as he believed “a control like this would land in Accessibility.” However logical a presumption, it’s not really a hardcore accessibility feature. Here’s the thing about accessibility features, on iOS or Android or anywhere else: they’re a discrete, specialized—and admittedly esoteric—suite of settings intended to meet highly specific needs. While it’s true “accessibility is for everyone” for the most part, the majority of the options under Accessibility are decidedly opinionated in their target demographic.

Take the AssistiveTouch pointer, for instance. AssistiveTouch is itself a subset of specialized features aimed at aiding those with motor disabilities; as such, the pointer was expressly designed to serve that greater purpose. This focus is why, as one anecdote, the response to the AssistiveTouch pointer rankled me a few years ago. Apple added it for a reason. Indeed, the company’s senior director of global accessibility policy and initiatives, Sarah Herrlinger, told me in an interview at the time it “isn’t your traditional pointer,” yet iPadOS power users were so desperate for proper pointer support that many in the community appropriated the AssistiveTouch pointer and lamented how it doesn’t work like the aforementioned traditional pointer. But it wasn’t conceived to be conventional; my understanding is Apple’s Accessibility group “handed off” the AssistiveTouch functionality to the wider OS team to be further massaged into the mainstream pointer feature that exists today. I got pushback from a lot of people for explaining all of this under the notion that AssistiveTouch isn’t perfect and warrants criticism… to which I still say: sure, but it isn’t meant for you and your nerdy whims.
Anyway, that the new Liquid Glass toggle in iOS 26.1 Beta 4 is not a true accessibility feature is sensible in the same way the Display Zoom options (also under Display and Brightness) aren’t found in Accessibility. They’re more about personal preference than absolute necessities like, say, the PWM toggle for iPhone 17 models. If desired, that toggle can disable pulse width modulation in favor of what Apple describes as “a different way to dim the OLED display, which can create a smoother display output at low brightness levels.” It exists to help people who may perceive these changes in brightness, as the detection of flickering can be bothersome to some and can cause pain such as eye strain and headaches. By contrast, the advent of the Liquid Glass toggle really isn’t that serious, as the kids say. It’s more about a preferred aesthetic than it is about actual accessibility.
On that note, I’ve heard many on social media bemoan the arrival of this new setting because it telegraphs the message that Apple is capitulating to the criticism from the summer beta period and effectively abdicating its position on Liquid Glass. But that’s not right—a true abdication would entail scrapping Liquid Glass altogether and bringing back the iOS 18 design language. Adding a setting for choosing between Clear and Tinted is the company merely giving users the power to decide between the “full” Liquid Glass experience and something slightly more “normalized,” for lack of a better adjective. And for those who need the utmost contrast and visual fidelity, they can go into Accessibility and flip on Reduce Transparency to extend the Tinted look even further.

As I wrote following the WWDC keynote, I sat with Herrlinger for a few minutes after the presentation, and while I didn’t conduct a full, on-the-record interview, I was able to attribute to her that the Accessibility team worked “closely” with the Design team to make Liquid Glass as visually accessible as possible. Reduce Transparency will always be there if you really and truly require the extra oomph it offers, she said to me.
Apple isn’t retreating from Liquid Glass—on the contrary, they’re refining it.
I’m no longer riding the beta train, but I enjoy Liquid Glass a lot. I anticipate not changing its appearance once iOS 26.1 ships. I like Liquid Glass so much—and use the Medium Display Zoom on my iPhone Air with larger text—that I don’t use Reduce Transparency because my eyes have acclimated to, and can bear, the default. Plus, I think Liquid Glass looks fucking cool as-is, and I want to use it with the creator’s intent.
NBC News Layoffs a Sobering Reminder People Like Me Have the Ricketiest Seats at the Media Table
Corbin Bolies reported sobering news for The Wrap earlier this week: NBC News has decided to gut its newsroom’s bespoke diversity coverage teams. He writes the move is part of recent layoffs, which see the Peacock network say goodbye to 150 personnel.
The cuts comprise 7% of a newsroom of “about 2,000 staffers,” according to Bolies.
“The cuts mean that the verticals NBC BLK, NBC Asian America, NBC Latino and NBC OUT will no longer have dedicated teams bolstering their coverage,” Bolies wrote on Wednesday. “The verticals will continue to publish stories related to the specific groups and NBC News may ultimately retain up to five staffers who will contribute coverage on the verticals to the newsroom, according to one source, as the dedicated teams focused exclusively on these verticals are sunset.”
The latest layoffs, which a source told Bolies were described by NBC News executive vice president of editorial Catherine Kim as “a difficult day for a lot of us,” were announced during a brief morning meeting earlier this week.
While Bolies notes NBC News may retain a small number of people to write stories about underrepresented groups, I found the move distressing, if unsurprising. If you know me on LinkedIn, you may know I occasionally post these “Hire Steven” missives in which I stand on my proverbial soapbox and tell anyone who’ll listen that I’m open to a staff role someplace. The thrust of my pitch is simple: I like to think I’m a damn good journalist… I know how to write well, and my lived experiences as a lifelong disabled person (coping with multiple conditions, at that) give my work instant credibility.

The larger argument, of course, is that in my 12 years in the tech media racket, earnest disability coverage has been woefully inadequate. Too often, accessibility coverage is relegated to awareness weeks and days, and oftentimes the accessibility stories that do get published are cobbled together by well-meaning yet able-bodied writers who aren’t intimately familiar with the topic. For my part, I’ve dedicated my career to making marginalized people like me more seen, but my reach goes only so far as an independent journalist. There are definite perks to being indie, but there surely are perks to being part of the newsroom of an established outlet. I’ve been doing this a long time, and walking this beat is a damn lonely road. What’s more, I’d like to make more money and, frankly, explore what other challenges and opportunities may be out there.
I’ve applied to innumerable staff writer jobs over the last few years, to no avail. I even had a prominent editor at a very large and well-known newspaper tell me once, during an interview, that disability inclusion in tech is “way too niche” for their readership. I found the comment galling and dispiriting, let alone shortsighted and borderline ableist. Anyway, the salient point of this piece is not to bemoan my own job prospects, but to use my experiences as an illustration that, per a point Bolies made in his story, these cuts are part of the journalism world’s “larger retreat” from DEI efforts. Again, as a marginalized person owning a marginalized beat, it can be difficult sometimes to keep pushing forward with my work, not because I’m disinterested—on the contrary, I love being in journalism—but because (a) it’s becoming ever more abundantly clear people like me live at the margin’s margin; and (b) I increasingly feel like my byline hurtles itself into a black hole, swallowed up by ostensibly more pressing, more “glitzy” stories.
Like I said, it’s a fucking lonely job—a sentiment I’ve espoused before.
So much is made of gender, race, and sexuality. June is Pride Month, and every news station makes hay over it—yet July is Disability Pride Month and there is nary a peep about it. Never mind the intersectionality between, say, gay people and disabled people; it’s as though society likes to pretend the disabled community doesn’t exist. But we do exist, and I’m exhausted from always trying to Oliver Twist my way through the industry, essentially justifying my existence to all the editorial powers-that-be.
Apple Unveils First Wave of M5-Powered Products
Apple on Wednesday announced the M5 chip, the company’s latest generation of custom processors. It powers refreshed iPad Pro, MacBook Pro, and Vision Pro devices.
“M5 delivers over 4x the peak GPU compute performance for AI compared to M4,” Apple wrote in extolling the virtues of its next-gen, designed-in-house silicon. “[It features] a next-generation GPU with a Neural Accelerator in each core, a more powerful CPU, a faster Neural Engine, and higher unified memory bandwidth.”
Of the new products, the iPad Pro and Vision Pro are, in my opinion, the most interesting to ruminate over. They’re certainly noteworthy enough to cover here, even tangentially.
First, the updated iPad Pro. As I wrote last week, I’d been eagerly anticipating today’s news because I’m keen on downsizing to the 11” iPad Pro. As lovely as my 13” M4 model is for its big OLED screen and its conduciveness to iPadOS 26’s revamped multitasking system, the truth is I don’t use the tablet for productivity. Indeed, I mostly use it as a conduit for content consumption whilst lounging on the couch in our living room; for that, the 13” model’s large footprint makes it unwieldy for reading in Safari or Apple Books or whatnot. Thus, my gut feeling is the 11” iPad Pro would much more realistically represent what I actually use a tablet to do—however well-intentioned I am, the reality is the 13” model is more aspirational than accurate in terms of how I use the device on a daily basis. Federico Viticci, I decidedly am not—not anymore, anyway.
I believe it’s healthy to reassess the tools I use, lest I fall into complacency and hubris.
Which model do I want? The 11” iPad Pro (in silver) with cellular and 1TB storage.
As to Apple Vision Pro, I personally don’t feel a compulsion for a new one like I do with the iPad. My M2 model still does everything I want it to do—which admittedly isn’t a whole lot—and I’m one of those rare birds who likes the Solo Knit Band that originally shipped with the headset. It’s comfortable and fits me well, but the advent of the new Dual Knit Band (available at $99 for OG Vision Pro users like me) serves as a reminder that, as ever, hardware accessibility matters. To wit, it matters how accessible this new Dual Knit Band is to take on and off, as well as adjust for comfort. After all, Apple’s fancy-pants $3,500 “spatial computer” does no good if you can’t easily get it on your face first. Overall, count me with my good friend John Gruber, who writes today “I find using [Vision Pro] with the Solo Knit Band comfortable for as long as I care to use it.”
Finally, a cursory note on external display support. The M5 chip enables iPad Pro to connect to external monitors at up to 120Hz, which Apple says is “ideal for creative workflows like video editing as well as gaming.” This bit of nerdy arcana appeals to me insofar as I’m also excited for refreshed external displays from Apple. My white whale display is the Pro Display XDR, although intellectually I’m well aware the Studio Display would be sufficient for my desk’s needs. (I like the Mini-LED tech and larger screen of the XDR.)

I say this because I’m still using my 6-year-old Retina 4K 21.5” iMac that runs not on Apple silicon, but Intel. macOS Tahoe dropped support for my machine, and due to technological inertia and general life stuff, I have yet to move more deliberately on upgrading my desk setup. I’m lucky that this iMac still runs pretty well given its ancient age, and even luckier my workflow as a journalist involves little more than dealing with plain text files, video calls for interviews, and basic web surfing. While I’m appreciative of lots of horsepower because I’m a nerd, my usage doesn’t exactly scream “bleeding edge” compute power. Markdown files are extremely small and very low maintenance.
I should upgrade my Mac as soon as possible, but I’m doing okay for right now.
As for the M5 iPad Pro? I must get me an 11” version posthaste.
‘F1: The Movie’ Starts Streaming In December
Apple on Monday announced via press release that its summer auto racing blockbuster F1: The Movie will be available to stream beginning Friday, December 12. The film, produced by Jerry Bruckheimer and starring Brad Pitt, casts Pitt as the so-called “greatest that never was,” Sonny Hayes, a driver whose career was derailed by a racing accident and who is recruited decades later by an ex-teammate to save a flailing F1 team.
“It’s been thrilling to see audiences around the world embrace ‘F1 The Movie’ in theaters,” Bruckheimer said in a statement for the announcement. “Now, we’re beyond excited to bring this exhilarating, cinematic ride to fans everywhere through Apple TV’s unparalleled global reach. Partnering with Apple and Formula 1 has been an incredible journey, and we’re proud that even more viewers, from every corner of the world, will get to experience the heart-pounding action and passion that defines ‘F1 The Movie.’”
Apple posted a trailer to YouTube (embedded below) revealing the streaming date.
Notably, my friend Stephen Hackett linked to a MacRumors story in which writer Eric Slivka picked up on a sly bit of product news: Apple TV+ is apparently now known simply as Apple TV. Indeed, Apple’s single line of copy in the press release says the change represents a “vibrant new identity” for the streaming service. It should be noted, however, that as of this writing, the Apple TV website still uses the old nomenclature. I decided to cover the F1 news partly for the name change, but also because it reminded me of something that’s bothered me about “Apple TV” as branding for some time now.
To wit, Apple TV can refer to one of three things: the app, the service, or the box. From a disability perspective, primarily for cognition’s sake, it can be awfully difficult to distinguish the three entities. Apple sells a box called Apple TV 4K which includes the Apple TV app and, within it, the Apple TV service. That’s confounding for anyone, let alone someone coping with an intellectual disability. When most people think of Apple TV, they think of the service where Severance and Ted Lasso live—in that sense, I’d guess that’s what drove the name change. Nonetheless, the naming strategy remains messy with the aforementioned box and software.

I’ve heard many people say Apple TV is “built in” to, say, Samsung televisions so you needn’t go to an Apple Store and buy one, when they’re conflating two different things: they mean Apple TV the service while alluding to the Apple TV box running tvOS. Apple should clean up its mess; the situation gives off strong Spider-Man meme vibes. Lest you think I’m making a mountain out of a molehill, my contention is that words matter. Clarity matters. Even if Apple TV the service gets the lion’s share of the cachet in terms of name recognition, it doesn’t mean the others ought to languish in a convoluted game of word association.
See also: back in June, I wrote about the accessibility of the movie’s “haptic trailer.”
F1: The Movie grossed $629 million at the box office worldwide, according to Apple.
Apple Discontinues Clips App
Some weekend news from Eric Slivka at MacRumors: Apple has cut the cord on Clips.
“Apple has essentially discontinued Clips, its video-editing app designed to allow users to combine video clips, images, and photos with voice-based titles, music, filters, and graphics to create enhanced videos that can be shared on social media sites,” Slivka reported early Saturday morning. “The app has been removed from the App Store, and a support document on Apple’s site says that the app is no longer being updated and would no longer be available for download for new users as of yesterday.”
Clips has been a mainstay on the various iPhones I’ve used since the app debuted in 2017, although I can’t recall the last time I used it—or even launched it. I remarked recently how Apple has a penchant for letting its App Store apps wither on the proverbial vine—which, as Slivka rightly notes, the company apparently did over the last few years by updating Clips only for bug fixes and basic maintenance. As a diehard sports fan, my hope is Apple Sports isn’t destined to match Clips’ fate whilst I wait for Apple to someday make an iPadOS version. As I’ve often said in my coverage of Apple Sports, the app is one of my favorite—and most heavily used—apps on my phone.
I remember covering Clips at the time of its introduction because, as ever, there were accessibility ties. To wit, Apple was boastful of the fact the app could generate real-time captions for its short-form videos; the captions were useful, of course, to Deaf and hard-of-hearing people, making dialogue more accessible and inclusive. Back then, I remember thinking how inspired the feature was, given TikTok and Instagram Reels had yet to pervade the mainstream consciousness. Nowadays, the vast majority of short-form videos I see have live captions enabled by default, and it’s heartening to notice the change as a lifelong disabled person who, coincidentally, has a level of congenital hearing loss.
NBA’s Los Angeles Lakers to Stream Select Games This Season Formatted for Apple Vision Pro
My good friend Jake Krol at TechRadar reported earlier this week that the NBA’s Los Angeles Lakers will stream select games in Apple Immersive Video, Apple’s format for its mixed-reality headset. The streams will be available in both the NBA and Spectrum SportsNet (the Lakers’ broadcast partner) apps for visionOS early next year.
“It’s not every game, but for those that are streaming—exclusive to the $3,500 Spatial Computer—you’ll get access to views that put you right in the middle of the action,” Krol wrote of the Lakers’ plans for Vision Pro and Apple Immersive Video. “Special cameras that support the format will be set courtside and under each basket to give you perspectives that amp up the immersion. The Lakers’ games will be shot using a special version of Blackmagic Design’s URSA Cine Immersive Live camera.”
Apple and the NBA will announce which games on the schedule will be available in Apple Immersive “later this fall,” specifically sometime next month, according to Krol.
As a diehard sports fan—which includes basketball—I’m extremely excited by this news. For one thing, I do have a Vision Pro I mainly use for entertainment: watching TV shows and movies. For another, Krol’s story strikes me as lending more credence to my notion that Vision Pro can make watching sports more accessible to Blind and low vision people. Granted, this isn’t the same as sitting inside Crypto.com Arena, but by virtue of Apple Immersive, it’s damn near close. Watching live sporting events as someone with a visual disability can suck depending on one’s needs and tolerances, which is why using Apple Immersive to shoot games is so tantalizing. The whole point of Apple Immersive is to make viewers feel like they’re right there even if they’re really not; using the format to stream sports can elicit those same feelings of… immersion. In an accessibility context, someone like me could feel much closer to the action—and enjoy it a lot more.
As Krol notes, Apple has taken a liking to using its products for capturing live sports for streaming. Jason Snell at Six Colors recently reported on iPhone 17 Pro being used to film a recent Tigers-Red Sox game from Boston’s Fenway Park for Friday Night Baseball on Apple TV+. Similarly, I wrote in April about the NBA’s Sacramento Kings using a tactile display to make games accessible to Blind and low vision fans visiting Golden1 Center.
Slide Over’s Return Receives A Hero’s Welcome
Jason Snell at Six Colors wrote this week about a big addition coming to iPadOS 26.1, now in beta: Slide Over. Indeed, one of the tentpole features of the OG multitasking system introduced in iOS 9, Slide Over is soon returning to iPadOS. The functionality was removed in iPadOS 26.0, a decision longtime iPad-loving friends of mine such as Harry McCracken at Fast Company lamented. Maybe it’d return someday, he wished.
Return it shall, according to Snell.
“In iPadOS 26.1 beta 2, Slide Over is now an explicit part of the new multi-window multitasking view,” he reported on Monday. “To enable it, open a window and resize it so that the three ‘stoplight’ buttons appear, tap and hold on the green one, and choose Add to Slide Over. Or choose Move to Left (or Right) Slide Over from the Window menu. Or type option-globe, left or right. All of those will work.”
He continued: “When Slide Over is invoked, the current window will be resized and stuck in the corner. You can grab the top of it and slide it off-screen, and it’ll vanish—only to reappear when you swipe your finger from off the side of the screen back on. You can stick the window on either side, and it’ll hang out there, regardless of whether you’re using full-screen windows or have a bunch of windows. You can even resize the Slide Over window when it’s on screen, and it’ll stay that size—unlike the old implementation.”
Notably, Snell writes Slide Over works only in multi-window mode. Nonetheless, he’s absolutely right when he adds it’s possible to use fullscreen apps in the new windowed mode. “Nobody’s going to force you to make those windows smaller,” Snell said.
Reading Snell’s story got me pondering my own iPad usage. Just over a year ago, I was gifted a 13” M4 iPad Pro (with 1TB storage and cellular) for my birthday. Apropos of the new multitasking capabilities in iPadOS 26, the biggie iPad is ideally suited as a laptop replacement for travel. But I’m no longer as bullish on iPadOS-as-productivity as I once was, mainly because I’ve come to prefer macOS for work nowadays. To be clear, this is not so much a philosophical difference—indeed, iPadOS 26 is terrific, if grossly overdue, and I maintain the iPad is the most accessible computer Apple’s ever created—as it is my personal preference changing over time. Thus, the iPad has been relegated to content consumption duty on the couch. Much like my iPhone Pro Max fatigue, I’ve discovered the 13” iPad Pro—however thin and light, and sporting a gorgeous OLED screen—is considerably less conducive to lounging. With the exception of watching movies and TV shows, the 13” model is awkward to hold for extended periods, a sentiment beautifully illustrated whenever I rotate it from portrait to landscape orientation.
Given this shift in mentality, I’m excited to hear M5-powered iPad Pros are purportedly on the way. Presuming the rumors become reality, I’d love to downsize to the 11” iPad Pro so as to better suit my tablet usage. While I’m not ashamed to admit my lack of iPad productivity (I do have the Magic Keyboard as well) is partially due to perpetually living in the throes of severe anxiety and depression—i.e., I don’t touch grass as often as I should—I nonetheless have noticed my penchant for using the iPad as a passive device for relaxation, particularly at nighttime. The truth of the matter is, for this use case, the 13” Pro is annoyingly unwieldy all things considered. The 11” iPad Pro gets all the goodness of iPadOS 26—the reimagined multitasking system and everything else—including the much-ballyhooed return of the dearly beloved Slide Over feature.
See also: Don’t miss Federico Viticci’s take for MacStories on Slide Over’s resurrection.
iPhone Air Review: A Thin and Light Thrill Ride
The accessibility story of the all-new iPhone Air can be distilled into one word: hubris.
In the weeks ahead of last month’s unveiling, I was steadfastly dubious of Apple’s purported “iPhone Air”—a name that ultimately proved real—largely because I assumed its thin-and-light body would make it too hard to carry and hold. I was sure the Air would be inaccessible because, true to its name, its hallmark physical traits would mean less tactility to comfortably grip. Hardware accessibility matters, after all, and Apple’s latest feat of engineering underscores that notion.
As it turns out, I was wrong—kinda. The iPhone Air is, in fact, an extraordinary device.
Although there remains a certain class of people for whom the iPhone Air’s thinness and lightness could be detrimental as a physical object, the $999 phone has, in my week or so of testing, made for the most enlightening and enjoyable review period of any iPhone I’ve ever covered—and I’ve reviewed a lot of them over my 12 years as a journalist. Apple sent me an iPhone Air (a black 1TB model)—alongside the regular iPhone 17 and 17 Pro Max; more on those later—and I chose to use the Air. For one thing, I have neither the time nor the bandwidth to test all three phones and then write three bespoke reviews. For my sanity, I have to pick one and ride with it. For another, the Air is, from my unique purview, the most interesting of this year’s crop of iPhones. It’s certainly the coolest model, thanks to its svelte design. My time testing the Air has shown me that, while I wasn’t wrong in my inclination per se, the phone’s ballyhooed thinness and lightness actually are its strongest assets in a disability context.
I won’t bury the lede: The iPhone Air is spectacular and it’s my new daily phone.
How New Hardware Confronts Old Hubris
I’ve been a devout iPhone Max (née Plus) user for over a decade now. In nearly all my past reviews, I’ve pointed out the need for me to make a technological Faustian bargain: to get the biggest, easiest-to-see screen, I must incur the cost of coping with an aircraft carrier in my hands and in my pocket. The better battery life was the proverbial icing on the cake. Every iPhone I’ve used with regularity for the last several years has been a Pro Max of some sort because I truly believed I needed that big screen, ergonomics be damned. In the last year or so, however, I’ve found myself growing weary of my aforementioned deal with the devil. I’ve grown weary of lugging around such a huge object everywhere I go, even if it is just across the house. The Pro Max’s screen, while glorious to behold, fits as gracefully in my hands and pockets as a bull in a china shop.
The iPhone Air, then, strikes me as offering the best of both worlds.
Consider the screen. While smaller than the 6.9” display on my 16 Pro Max, the 6.5” display on the Air is plenty big enough for me. It even does the iPadOS-like trick whereby, if you rotate the phone into landscape orientation, the UI morphs into a two-column view in apps like Mail and Messages. I mentioned in last year’s review that the Pro Max’s screen size is at the edge of my threshold for comfort; go any larger and Apple risks encroaching “seriously close” on iPad mini territory, I said, akin to Icarus flying too close to the sun. By contrast, the Air’s big display feels perfectly suited to its form.
And what form—it’s the Air’s entire selling point. As I wrote earlier, the Air’s thinness and lightness have proven to be its most endearing attributes. With the exception of my one-year sojourn with my beloved iPhone XR, chosen precisely because it came in blue, I’ve always picked the Pro model because (a) I’m a nerd; and (b) I wanted the best cameras. With the iPhone Air, the emotional appeal is, like the XR in 2018, trumping the nerdiness in me. Whenever I pick up the Air, I instantly get an immense feeling of joy and delight and, frankly, boyish wonder—I’m continually awestruck by its design and how thin and light it is. Intellectually, I’m fully aware the Air is not Apple’s best iPhone. I know it lacks the LiDAR sensor needed for the myriad detection modes in the Magnifier app. I know the phone’s single-camera system isn’t as good or robust as the Pro’s. But I don’t care—just as when I chose the XR over the objectively better XS because I could have a blue phone, I prefer the Air over the Pro for the visceral user experience.
There’s a reason emotion is, psychologically, a key part of advertising: it works.
As a practical matter, I prefer the Air for the obvious reasons of its thinness and lightness. My perspective on the device has done a complete about-face: I now love how easy it is to hold my phone in my hand and carry it in my pocket. On that note, I disagree with my friend Jason Snell’s opinion that to put a case on the Air is to negate the phone’s reason for being. While it’s true a case inevitably adds bulk and weight, I’ve found the increase to be infinitesimal in my testing. For my purposes, I’ve been ardently pro-case on every iPhone I’ve ever used, more for ergonomics than protection. My fine-motor skills (and my muscle tone) are decidedly lackluster and, consequently, I’m prone to accidentally dropping things. The last thing I want to drop is my iPhone, let alone a review unit from Apple. What a case adds is friction and a “tackiness” that helps better secure the phone. A case makes holding my phone more accessible. I understand Snell’s contention that the 17 or 17 Pro are better choices if you’re going to use a case, but I use one anyway. Apple gave me its official iPhone Air case (in Shadow) and I like it. It gives me—yet again—the best of both: protection and better accessibility.
Cursory Notes On Camera And Battery Life
Speaking of accessories, Apple included the $99 iPhone Air Battery Pack in my proverbial box of goodies. Truth be told, I haven’t opened it yet, though I do plan on using it eventually. Battery life on the Air is considerably worse than on my 16 Pro Max, but it hasn’t been so bad that I’ve felt anxious about charging. As a remote worker, I spend a lot of time at home, which means my phone is usually sitting on the charger on my desk. I have access to a charger in other parts of the house as well, and battery life has never been so constrained as to be unusable. Where the Air’s battery would be put through the wringer is, for example, when I’m at Apple Park covering WWDC or another event. On those special days, I’d surely bring the Battery Pack with me. Otherwise, the Air’s battery has been fine in the humdrum of my everyday life.
As to the camera, it’s occurred to me during testing that I actually use the ultra-wide on my 16 Pro Max more often than I realize. I admittedly miss it on the Air, but not so much that it’s a dealbreaker that moves me to the 17 or 17 Pro. I’ve been perfectly happy with the image quality of the Air’s “fusion” camera, and as I said previously, care not that it doesn’t have the LiDAR sensor needed for the Magnifier app’s various detection modes.
Cursory Notes On The Other Phones
As I said at the outset, Apple included the 17 (in green) and 17 Pro Max (in orange) along with the black Air. Design-wise, the Pro Max is the complete and utter antithesis of the Air: more industrial and “tool-like” in ways the Air is not. If the Air is a sports car, the Pro Max is a Range Rover. The latter’s relative girth and heft are striking compared to the Air; as such, the Pro Max doesn’t give me the same feelings of giddiness. The Pro Max is more pro than ever before, in form and function. As to the color, the “cosmic orange” is loud and proud. It’s nice, although not to my personal taste. I’d probably pick the blue.
Now, the standard 17 is interesting indeed. Were the Air nonexistent, I’d probably choose the 17 (in blue) as my new phone. Beyond my fatigue over the Pro Max’s size, the standard 17, spec-wise, is damn impressive. And it has the ultra-wide camera. Apple ought to be commended for doing a great job of reaching feature parity across the refreshed iPhone line. I’m increasingly feeling as though I don’t always need the highest-end, tricked-out gadgets. The 17’s value proposition is stratospherically high, and its siren song would be much more seductive were it not for the advent of the Air.
The Bottom Line
Going back to Snell’s own review, he says the iPhone Air is a harbinger of the future—and he’s absolutely right. The iPhone Air is not the best iPhone if we’re talking metrics—although kudos to Apple for putting the A19 Pro chip inside—and it isn’t the best iPhone for most buyers (that’s the 17). But for me, in the present, it’s the best, most accessible iPhone yet because it’s a marvel of engineering—and emotion. Nike may have the trademark on “Air Max” cornered, but that’s essentially what the iPhone Air feels like to me: a slimmer Max. I couldn’t be happier… it truly does offer the best of both worlds.
And I can get it in blue to boot.
Disney Announces Redesigned Disney+ App, More
In a press release on Thursday, Disney announced big changes for Hulu and Disney+.
Hulu, Disney says, is poised to “reach worldwide audiences” beginning next week, on October 8, when it becomes what the company is calling “the global general entertainment brand on Disney+.” Disney notes the changes, in strategy and in user interface design, are anticipatory in nature as the “fully integrated unified app experience” is expected to arrive sometime next year. The news comes after it was announced in August the standalone Hulu app would be retired and folded into Disney+.
Most interesting from an accessibility perspective is the overhauled app design. In the announcement, Disney shares screenshots of the new look alongside detailed explanations of what’s changing. At a high level, the tab-oriented design is conceptually identical to what Netflix did to its app: like Netflix, Disney is touting a “simpler” and “more intuitive” experience, replete with splashier visuals and personalized recommendations powered by “an updated algorithm that learns user preferences over time.” Aesthetically, I think the new design is a real looker; in terms of usability, I like this trend of using a top-anchored tab bar for accessibility. Especially for cognition—which arguably matters most of all when it comes to navigating streaming apps—it can be immensely helpful for many people with intellectual disabilities to know that finding stuff always starts at the top. Likewise, the heightened emphasis on nicer poster art and other visuals can make it more accessible for someone to identify, for example, The Simpsons by seeing a big picture of Homer Simpson’s face in the menu. That the Disney+ app is moving to a more tab-focused design is especially important because, being the media conglomerate it is, Disney owns (and thus wants to promote) content from properties like Marvel, ESPN, Star Wars, and of course, the aforementioned Hulu.
“The new design is more modern and intuitive so users can find and discover the characters and stories they love,” Disney wrote of the forthcoming redesign of the Disney+ app. “This includes a new video display in the Hero carousel and a more dynamic brand row, showcasing the latest titles from each brand. We’ve also updated our content sets to showcase more cinematic poster-style artwork.”
I’ve been enjoying Netflix’s new app. It’s pretty and is more user-friendly to me.
Elsewhere, Disney mentions widgets are set to launch on iOS—by contrast, the redesign is more focused on tvOS, for instance—so as to “take users directly into our programming with one click.” Moreover, the company teased the enhancements are “just the beginning,” saying “additional updates [are] planned” in the run-up to the official release of the ballyhooed “unified app experience” coming sometime in 2026.
Colorado Spotlights ‘inclusive tourism experiences’
Last month, the Colorado Office of Economic Development & International Trade (OEDIT) issued a press release in which the state agency announced what it called “a curated, multi-day destination tour and retreat dedicated entirely to accessible travel.”
The event was organized alongside accessible travel company Wheel the World.
According to the OEDIT, the event, held September 9–12 in Denver, “brought together more than 20 accessibility advocates, influencers, journalists and Colorado tourism leaders for a series of adaptive outdoor adventures, culinary and cultural experiences and meaningful conversations about accessibility in travel,” adding “set against the backdrop of The Mile High City, the multi-day experience showcased Colorado as a destination setting the national standard for inclusive travel” with kayaking and more.
“The Polis-Primavera administration is committed to building a Colorado for All,” Lt. Governor Dianne Primavera said in a statement included in the press release. “Colorado is proud to lead the way in making travel inclusive so that everyone, regardless of ability, can experience the beauty, adventure, and culture our state has to offer. This gathering shows what’s possible when we commit to breaking down barriers and ensuring that travel is truly for all.”
The aforementioned “Polis” refers to Jared Polis (D), Colorado’s governor.
Colorado, which boasts it’s “the best state in the nation for people with disabilities” according to the AAA State of Play 2025 Report, said the recent event builds on efforts by the Tourism Office and Wheel the World to “promote and expand accessible travel in the state, including the Accessible Travel Program launched in August 2024.” The program is described as “[seeking] to enhance accessibility in key tourist destinations, ensuring that Colorado continues to be a welcoming place for all travelers.”
“This gathering was about more than exploring Colorado’s natural beauty and engaging culture,” said Alvaro Silberstein, co-founder and chief executive officer of Wheel the World, in his own statement for the release. “It was about connection and progress. The Colorado Tourism Office is confidently showing what’s possible when destinations lead with accessibility and prioritize the experiences of travelers with disabilities.”
As of this writing, Colorado is one state I’ve yet to visit despite some of the closest people in my life living there. (In fact, I have a note in Apple Notes tallying the states I’ve visited; I’m up to 9 thus far.) As I’ve written several times in my extensive coverage of Airbnb’s accessibility efforts, travel and technology are indeed intertwined. In this case, Colorado’s considerable investment in making the state accessible to visitors (and residents) involves technology both in the adaptations themselves and, obviously, in the dissemination of the news—the latter of which is crucial because it’s highly likely the majority of people are unaware of ostensibly obscure inclusive travel initiatives.
Colorado’s accessible travel news was preceded by California’s own announcement in early August that it launched its first-ever Accessibility Hub. The state described the website as “a comprehensive online resource designed to empower travelers with disabilities to explore the Golden State with greater ease and confidence [and an] initiative [supporting] both travelers and industry professionals with tools, tips and curated content aimed at making California more inclusive and navigable for all.”
OpenAI Releases New ‘Sora’ Video Creation App
Ryan Christoffel reports for 9to5 Mac that OpenAI, maker of ChatGPT, has a new app in the iOS App Store: Sora. The app, as Christoffel describes it, uses “AI for video creation.”
According to Christoffel, Sora’s description reads in part that the software is meant to “turn your ideas into videos and drop yourself into the action,” adding the app has been built with the purpose of being “a new kind of creative app that turns text prompts and images into hyperreal videos with sound using the latest advancements from OpenAI.”
Christoffel notes Sora represents “the first major OpenAI launch” since its latest model, GPT-5, was released to much fanfare last month. Based on some reports I’ve seen online from intrepidly nerdy early adopters, access to Sora currently is restricted to a waitlist. I haven’t yet downloaded it, but I’ll likely be putting my name on the list soon.
From an accessibility perspective, Sora’s utility lies in a sentence in the aforementioned app description: “A single sentence can unfold into a cinematic scene, an anime short, or remix of a friend’s video.” Of course, this methodology follows the lead of OpenAI’s canonical chatbot; to wit, writing a text prompt spurs the AI to create based on one’s instructions. Although most people are familiar with simple prompts or queries for interacting with ChatGPT—or Gemini or Claude, for that matter—the reality is, the modality is genius for accessibility. In Sora’s case, that the user is supposed to give the app a short descriptor of what to do means video creation—ostensibly reasonably complex and involved depending on creative intent, tools, etc.—suddenly becomes far more accessible because AI is assuming the load and doing the grunt work. As I’ve written many times, Sora very plausibly could do for video creation what, say, ChatGPT does for web searches and other research for school essays and whatnot. Again, the salient point is, used in this manner, AI is a bona fide enabling technology that breaks down barriers to creative processes from which one otherwise would be excluded for myriad reasons. This is neither trivial nor esoteric—and it sure as hell isn’t “lazy.” I wrote about similar concepts in June 2024 when I interviewed Kantrell Betancourt about her then-new book on using Midjourney and seeing AI writ large as an assistive technology.
Gemini’s Arrival on Google TV Portends a More Accessible TV Experience for All Viewers
Google last week announced Gemini is coming to Google TV as one’s “conversational assistant to help you find content to watch on the big screen.” The new feature was detailed in a blog post written by Shalini Govil-Pai, Google’s vice president of Google TV.
“The TV is the heart of the home—the place where we gather, cheer and connect. For years, Google TV has made it easy to find great entertainment, and Google Assistant has helped TVs do more just by voice, from getting recommendations to dimming the lights,” Govil-Pai wrote in the lede. “Today, we’re introducing Gemini for TV. Everything you already do with Google Assistant still works, but Gemini on Google TV goes beyond simple commands and lets you engage in free-flowing conversations with your big screen. Get help finding the perfect show for whatever mood you’re in, brainstorm a family trip or answer complex homework questions. Just say ‘Hey Google’ or press the microphone button on your TV remote to unlock a new world of possibilities.”
Amongst the many examples Govil-Pai gives for using Gemini on the big screen is one involving one of my favorite shows, The Pitt on HBO Max. She notes people can ask Gemini questions such as ‘What’s the new hospital drama everyone’s talking about?’ and follow up with queries like ‘What are the reviews for The Pitt?’ Gemini’s user interface manifests as a horizontal search bar at the bottom of Google TV’s home screen.
The news comes after Google TV’s aforementioned home screen received a redesign.
I’ve written copiously about my bullishness over Gemini on my iPhone for accessibility, but it has relevance to televisions too. To Govil-Pai’s point about surfacing content based on vague, open-ended questions, it could be more accessible for someone with certain cognitive conditions to find shows like The Pitt because they needn’t be so precise in asking for what they want. As the artificially intelligent agent, Gemini is capable of inferring that the “new hospital drama everyone’s talking about” quite certainly could be The Pitt. This type of functionality only enriches the accessibility of the Google experience for things like, for instance, controlling one’s smart home setup, as well as Google TV’s content-centric software design—of which I’m already a big fan for accessibility reasons. All told, Gemini vastly improves Google TV’s value proposition, making it a solid choice for one’s home theater if accessibility and ease of use mean more to you than, say, hardware and software performance and ecosystem amenities.
Gemini is now available on TCL’s new high-end QM9K set, and will come to the QM7K, QM8K, and X11K models later this year. In addition, streaming boxes such as Google’s own Google TV Streamer and Walmart’s Onn 4K Pro—which I briefly reviewed back in early August—will receive Gemini in an update “later this year,” according to Govil-Pai.
Target Unveils ‘first-of-its-kind’ Self-Checkout Stations for Better Accessibility, Autonomy
Target earlier this month announced what it describes as a “first-of-its-kind” in-store innovation: accessible self-checkout kiosks. The new technology will roll out in the United States starting with the holiday season and continue throughout early 2026.
“A father stands beside his daughter at a Target self-checkout. She has low vision, and for the first time, she’s navigating the process on her own. He guides her through each step, offering quiet support as she scans an item. A soft beep sounds, followed by a clear voice reading the total. Her fingers move confidently across the tactile controller, guided by feel and sound rather than sight. The experience feels intuitive and empowering,” Target wrote in its announcement. “Thanks to Target’s new accessible self-checkout, moments like this will soon be possible for more guests across the country. Designed with and for disabled guests and people with disabilities, this solution is the first of its kind in U.S. retail. Rolling out to self-checkout stations nationwide beginning this holiday season and continuing through early 2026, it’s part of Target’s ongoing checkout improvements, reflecting our commitment to creating joyful, guest-first experiences that help all families feel seen, supported and welcome.”
The checkout system features a mix of hardware and software, including Braille and high-contrast icons, a tactile navigation button, and a headphone jack with adjustable volume. Target worked “closely” with the National Federation of the Blind (NFB), of which the company said “provided valuable feedback throughout the development, design and testing process,” adding the feedback “directly shaped the technology.”
Target goes on to reveal the father in that duo is a Blind man named Steve D, whose daughter has low vision. Notably, Steve works at Target as a user experience accessibility manager; the company says both he and his daughter “have spent years navigating stores that weren’t designed for them,” adding that Steve himself helped build the new accessible systems. The experience is profound, according to Steve.
“Shopping with my daughter and teaching her how to use the self-checkout, that was powerful,” he said in the post. “It’s not just tech. It’s joy, independence and change.”
The NFB is enthusiastic too.
“Target’s new accessible self-checkout experience is unique not only because it is a first in the industry, but because it was designed through collaboration with the blind, incorporating our technical expertise and lived experience,” Mark Riccobono, NFB’s president, said in a statement for Target’s announcement. “The rollout of this innovation further establishes Target as an industry leader in accessibility and a true partner of the blind in our quest for equal access to all aspects of modern life.”
It’s good to hear Target is investing in accessibility; the new checkout kiosks not only foster inclusion, they also instill independence. As an avid Target shopper myself, I typically go to a line with a human cashier, but do use self-checkout at other retailers. Even with low vision, I manage just fine, though I do lament smallish text, laggy touchscreens, and barcode scanning. On that last point, it oftentimes is difficult to find the barcode on the product; this slows me down and stresses me out—especially if people are waiting behind me.
Uber Introduces ‘Senior Mode’ for Older Adults
I received a promotional email from Uber late yesterday afternoon alerting me to the Uber app’s all-new “senior mode.” Out of journalistic curiosity, I clicked the “Learn More” button in the message, which zipped me to this page on Uber’s website. The header text on the page promises what Uber is calling “easier rides for older adults.”
“Caring for loved ones is a balancing act—that’s why we’ve made it easier to support your parents or grandparents,” Uber writes. “Set them up with a simpler ride experience so they feel confident going anywhere, knowing you’re there to help if needed.”
Amongst the hallmark features of senior mode is a “simplified app experience” replete with larger buttons and text. In addition, Uber boasts senior mode features “a minimal homescreen to help make booking even more straightforward, and only see essential booking details for added clarity.” Importantly, Uber says senior mode has aid-focused tools for caregivers and/or loved ones in the event someone needs help.
“Easily lend a hand if your loved one needs help—you’ll be able to track trips, add their favorite places, and call their drivers,” Uber says. “Plus, they’ll always have access to our on-trip safety features such as being able to call 911 and our 24/7 safety support.”
Uber has partnered with GIA Longevity and GoGoGrandparent in building senior mode.
While it’s perfectly logical to presume accessibility is about a discrete, admittedly esoteric suite of software features for people with disabilities, older adults—senior citizens—are very much in alignment with accessibility’s target demographic. It makes perfect sense because it’s only natural that people will require more help for daily living as they age. Eyesight gets less sharp. Hearing gets less sound. Fine-motor skills get less precise. Hell, many of our nation’s veterans are older people who became disabled whilst defending the country. Thus, Uber’s senior mode clearly is the company’s recognition of the aging process and the costs it incurs. Uber is making its app more accessible to seniors, the byproduct of which is heightened agency and autonomy.
Similarly, Apple’s Assistive Access on iOS and television products such as JubileeTV and LG’s new Easy TV all are technologies adapted—accessibility is nothing if not about adaptation—to be simpler for seniors (and anyone else with intellectual disabilities).
The advent of Uber’s senior mode comes months after my beloved Waymo introduced a feature—teen accounts—for those folks on the polar opposite end of the age spectrum.
Meta to Open Pop-Ups for Ray-Ban Display Glasses
Jay Peters reported for The Verge earlier this week that Meta is planning to open pop-up shops in various locales over the next several weeks as a means to let potential buyers check out its recently announced Ray-Ban glasses with a display. The $799 wearable, which Meta announced last week and is slated for release on September 30, is the most ambitious entry in the company’s ever-expanding line of connected eyewear.
“Meta is just about to launch its impressive smart glasses with a display, and to give more people a chance to try them out and see its other smart glasses and VR hardware, the company is going to open new shops starting in October,” Peters wrote Wednesday. “New Meta Lab pop-up shops will be located in Las Vegas and New York, and its Los Angeles Meta Lab location, which opened as a pop-up last year, will return as Meta’s flagship store. Meta’s Bay Area shop in Burlingame, California, remains open, too.”
According to Peters, Meta is opening its pop-ups to the general public in Las Vegas, Los Angeles, and New York City. The Vegas location opens at the Wynn Las Vegas on October 16, with the Melrose Avenue location in LA on October 24. The NYC spot on Fifth Avenue, in Midtown Manhattan, is scheduled to open a little later, on November 13.
Anyone interested in the display-equipped Ray-Bans can book demo time with Meta.
I felt covering Peters’ story was apropos given my feature this week on Lucyd’s glasses. Like with Apple Vision Pro early last year, I can’t help but feel a slight tinge of FOMO seeing reporter pals such as Bloomberg’s Mark Gurman share initial impressions of Meta’s Ray-Bans with a display; the reason isn’t jealousy, but rather deep curiosity about accessibility. At a high level, Meta’s new Ray-Bans hold more personal appeal than the Lucyd pair I reviewed precisely because the Ray-Bans have a screen. Moreover, while I’m not itching to get a pair because I’m not as invested in Meta’s ecosystem as I am Apple’s, the nerdy journalist in me is nevertheless damn intrigued by what Meta has built—particularly its wristband controller thing. Indeed, it’s moments like this—mainstream media types like Gurman getting hands-on (face-on?) time with Meta’s glasses before writing about them—that underscore my ardent belief that disability should have a more prominent place setting at the proverbial table of mainstream media coverage. At the very least, it should be obvious to the powers-that-be who run newsrooms that disabled people pay attention to the news as much as anyone else.
Apple AI Researcher Inadvertently Amplifies Accessibility in Recent Presentation, Blog Post
Apple earlier this week pushed a new post to its Machine Learning Research blog in which the company shares highlights of its most recent Apple Workshop on Natural Language and Interactive Systems. Apple describes the get-together, held May 15–16, as “bringing together Apple and members of the academic research community for a two-day event focused on recent advances in NLP [natural language processing].”
Apple’s posts on its Machine Learning Research website are nerdy, technical, and, frankly, academic. What grabbed my attention about this particular piece, however, is Kevin Chen’s presentation on what’s called “Reinforcement Learning for Long-Horizon Interactive LLM Agents.” Marcus Mendes highlighted the event for 9to5 Mac, describing Chen’s talk as “[showcasing] an agent his team trained on a method called Leave-one-out proximal policy optimization, or LOOP.” The agent, Mendes reported, was trained to perform multi-step tasks based on 24 different scenarios. Chen caveated that a significant limitation of LOOP at present is that it doesn’t yet support multi-turn user interactions.
According to Mendes, Chen employed the following prompt with the agent: “I went on a trip with friends to Maui recently. I have maintained a note of money I owe to others and others owe me from the trip in simple note. Make private Venmo payments or requests accordingly. In the payments/requests, add a note [called] ‘For Maui trip.’”
Chen’s ask is the “nut graf” in terms of accessibility. To wit, it’s highly plausible a disabled person who needs to Venmo their friends cash for a trip may find manually paying each person individually, along with appending Chen’s note, inaccessible for a variety of reasons—reasons which, as I’m often inclined to point out, transcend sheer convenience. Depending on one’s disabilities—cognitive, visual, motor, or some combination thereof—it’s easy to see how paying, say, more than one or two people could be tedious. Sure, you could copy-and-paste the note for expediency’s sake, but the fact remains having to manually pay people means traversing the Venmo app far and wide. Even if it is doable, ability-wise, feasibility doesn’t equal ease of use. Ergo, Chen’s reliance upon AI to do the grunt work makes paying people back not merely the conscientious, responsible thing to do—it’s infinitely more accessible too.
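To make the "grunt work" concrete, here is a minimal sketch of the kind of multi-step task Chen's prompt describes: parsing a plain-text note of trip debts into one private payment or request per person, each tagged "For Maui trip." This is purely illustrative—the note format, function names, and `make_actions` helper are my own inventions, not Apple's LOOP agent or Venmo's actual API.

```python
# Hypothetical sketch of the repetitive work an agent automates in Chen's
# Maui-trip scenario. All names and the note format are assumptions.

def parse_note(note: str) -> list[tuple[str, float]]:
    """Turn lines like 'Alice owes me 25' or 'I owe Bob 40' into
    (person, signed_amount) pairs: positive = they owe me (request),
    negative = I owe them (payment)."""
    entries = []
    for line in note.strip().splitlines():
        words = line.split()
        amount = float(words[-1])
        if line.startswith("I owe"):
            entries.append((words[2], -amount))   # I pay this person
        else:
            entries.append((words[0], amount))    # this person owes me
    return entries

def make_actions(note: str, memo: str = "For Maui trip") -> list[dict]:
    """Emit one private Venmo-style action per parsed entry."""
    return [
        {
            "kind": "request" if amt > 0 else "payment",
            "to": person,
            "amount": abs(amt),
            "note": memo,
            "private": True,
        }
        for person, amt in parse_note(note)
    ]

actions = make_actions("Alice owes me 25\nI owe Bob 40")
# -> one request to Alice for 25.0, one payment to Bob for 40.0
```

The point isn't the parsing itself but the loop: instead of the user traversing the Venmo app once per person, the agent fans a single instruction out into however many individual transactions the note implies.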
Again, Chen is explicit in disclaiming that the LOOP technology is imperfect and needs more massaging. Nonetheless, it’s extremely heartening (and downright exciting) to see how AI’s application in this manner has profound potential to make life so much more accessible for those (like yours truly) who are part of the disability community.