Amazon Makes Returns More Accessible
Ryan Christoffel at 9to5 Mac wrote late last week about a notable change to Amazon’s iOS app: return codes can be added to Apple Wallet. As someone who returns things to Amazon fairly often, I feel it’s a subtle yet impactful change for greater accessibility.
“Amazon rolled out support for ‘Add to Apple Wallet’ buttons inside its iOS app over the last few weeks. As part of the return process, instead of the standard in-app return code for drop-off locations to scan, you can now opt to save that code to Apple Wallet,” Christoffel said of Amazon’s recent update. “Having the Amazon return code inside Wallet is nice because it shows all the details you’ll need, plus ensures you can pull up the code at just the right time without being reliant on a solid cellular connection.”
I haven’t noticed the “Add to Wallet” buttons, but I’m going to look out for them now.
My process for obtaining Amazon’s QR codes for returns typically involves three steps:
Search Apple Mail for the return email
Tap the link in said message
Keep my phone open to Amazon so as to keep the code handy at Whole Foods
As you can probably surmise, these steps don’t exactly comprise the most streamlined, accessible workflow. (It also doesn’t help that search in the stock Mail app is pretty bad.) By contrast, the ability to save the Amazon codes to Wallet is a de-facto accessibility feature insofar as the Wallet app is available when my phone is locked. It removes the need for my three-step journey to find the needed information, and to traverse the equally bad user interface of the Amazon app. Beyond return codes, Christoffel also notes iOS 26 gives the Wallet app the capability to track Amazon orders. A nice addition, to be sure, but I’ll be sticking with the widget from my preferred package-tracking app, Parcel.
The Tantalizing Tale of the “Tensor Robocar”
Andrew J. Hawkins wrote for The Verge this week about a mysterious Bay Area-based autonomous vehicle company called Tensor. The startup, which is headquartered in San Jose, is a self-described “leading agentic AI company” that this week announced what it hails as “the world’s first personally owned autonomous vehicle” in its Tensor Robocar. The company’s conceit is apt in an accessibility context, as Tensor said its overarching goal is to “[empower] individuals to truly own your autonomy.”
“When the world shifts… how will you move?” Amy Luca, Tensor’s chief marketing officer, said in a statement included in the company’s press release. “We are building a world where individuals own their personal AGI agents, enhancing freedom, privacy and autonomy. With Tensor, we’re introducing the world’s first personal Robocar, ushering in the era of AI defined vehicles. This isn’t a car as we know it. It’s an embodied personal agent that moves you. It’s time to own your autonomy.”
According to Hawkins, Tensor is affiliated with autonomous vehicle maker AutoX, which operates both in China and here in the United States. Moreover, Hawkins notes Tensor claims to have offices in Barcelona, Dubai, and Singapore. AutoX has been testing autonomous vehicles in San Jose and the surrounding area since 2016. Tensor is looking to launch in America, Europe, and the Middle East starting sometime next year.
After his story ran, Tensor spokesperson Lena Allen sent Hawkins a statement.
“Since its founding in 2016 in San Jose, AutoX has been an American company. Its new consumer brand, Tensor, is also headquartered in San Jose, California, with satellite offices in Spain, the UAE, and Singapore. Tensor focuses on the US, EU, and the GCC markets. As an independent private California startup, they’re controlled by their U.S. employees, with significant majority investment/ownership from the UK, Japan, Korea and US. In 2018, AutoX launched their autonomous delivery service in San Jose for over 1000 self-driving delivery orders, operating until Covid lockdown. In 2020, AutoX received the second-ever driverless AV testing permit in California,” Allen said in response. “In 2019, AutoX entered the China market as a foreign company in China. We managed to obtain local self-driving test permits alongside with other foreign companies, such as BMW, Tesla, and VW. During the pandemic lockdowns in the U.S., we launched a fully driverless robotaxi fleet in China. However, starting several years ago, AutoX began winding down its China operations; all operations under the AutoX brand in China have been divested, with all offices closed and operations shut down.”
She added: “The AutoX brand and its China operations have been fully discontinued. We have evolved into the Tensor brand to better reflect our renewed focus on delivering personalized, private, and autonomous technology for individual ownership.”
Hawkins’ report caught my attention because of what Tensor is seemingly trying to do: extend ownership of autonomous vehicles to Average Janes and Joes. This is tantalizing in an accessibility sense because, as I’ve argued in the past, autonomous vehicles represent both accessibility’s apex and artificial intelligence’s profound powers. More pointedly, however great services like Waymo, et al, are today, a tomorrow in which a Blind and low vision person (like yours truly) could actually purchase an autonomous vehicle from, say, Tensor, would be literally life-changing. Granted, Waymo operates 24/7, but it isn’t available everywhere just yet; to have my own self-driving car would mean my transportation options wouldn’t rest on the mercies of availability because I could just get in my car and go wherever I wanted, whenever. The reason autonomous vehicles reflect accessibility’s zenith is that the technology empowers disabled people who are precluded from driving regular cars with greater feelings of agency and autonomy. It instills grander feelings of self-esteem and self-worth by giving us the independence so many of us crave in a society where the disability community is more often than not looked down upon with patronizing, paternalistic, and infantilizing attitudes. I can’t sum it up any better than Lana Nieves, executive director of San Francisco’s Independent Living Resource Center, who told me in 2023 she’s bullish on driverless cars because, as an adult, “why shouldn’t I be able to go where I want to go?”
Of course, all is not rosy in this situation. Indeed, there will come a reckoning sooner or later involving legislation, costs, and the notion that people like Nieves and myself should be able to “drive” cars if we can’t see. Nonetheless, it’s heartening to notice fledgling companies like Tensor acknowledge the value of people actually owning the robots-on-wheels they ride in. It gives me hope for a much brighter future in this space.
Blood Oxygen Sensor Returns to U.S. Apple Watches
Apple on Thursday posted a notice to its Newsroom site with an update on the blood oxygen sensor in Apple Watch. The company says the functionality has returned for users in the United States, adding that software updates (iOS 18.6.1 and watchOS 11.6.1, released today) reenable the dormant feature. The workaround comes amid Apple’s ongoing litigation with a company named Masimo over Apple’s alleged patent infringement regarding the aforementioned pulse oximetry sensor.
“Apple will introduce a redesigned Blood Oxygen feature for some Apple Watch Series 9, Series 10, and Apple Watch Ultra 2 users through an iPhone and Apple Watch software update coming later today,” Apple said in its short announcement shared today. “Users with these models in the U.S. who currently do not have the Blood Oxygen feature will have access to the redesigned Blood Oxygen feature by updating their paired iPhone to iOS 18.6.1 and their Apple Watch to watchOS 11.6.1. Following this update, sensor data from the Blood Oxygen app on Apple Watch will be measured and calculated on the paired iPhone, and results can be viewed in the Respiratory section of the Health app. This update was enabled by a recent U.S. Customs ruling.”
Apple emphasizes the update pertains only to American Apple Watches. It does not affect “units previously purchased that include the original Blood Oxygen feature, nor to Apple Watch units purchased outside of the [United States],” the company said.
John Gruber has posted a good piece on the situation. Notably, he reports a source at Apple said today’s fix is known as “HQ H351038” but is “not yet publicly available” on Customs and Border Protection’s Customs Rulings Online Search System website.
From an accessibility perspective, the restoration of the Apple Watch’s blood oxygen sensor is notable because monitoring one’s blood oxygen saturation is key for myriad respiratory issues. Indeed, lung conditions like asthma and pneumonia have the potential to lower blood oxygen levels, as do sleep conditions such as sleep apnea. Coincidentally, Apple added sleep apnea tracking to watchOS 11 last September. The Health app on iPhone received a new metric the company calls “Breathing Disturbances,” and users can track how elevated (or not) their breathing is during the night. Sleep apnea tracking is available on Apple Watch Series 9, Apple Watch Series 10 (which I have but never officially reviewed, alas), and Apple Watch Ultra 2.
Drive Program Alum Talks Experiencing the Program, Learning to Drive, More in Interview
Last month, I posted a story featuring an interview with Dr. Christina Potter. An academic researcher and experimental psychologist by training, Dr. Potter works as coordinator of the Drive Program run by Miami-based Nicklaus Children’s Hospital. Established in 2023, the Drive Program exists to “prepare neurodiverse individuals for a driving exam” using a virtual reality headset. Dr. Potter, Nicklaus Children’s manager of IT and digital technologies, explained to me in part that the “simple but powerful” impetus for the Drive Program was to “help young people, especially those who face challenges like autism, anxiety, or ADHD, to gain the confidence and skills they need to become safe and independent drivers.” The core problem the Drive Program sought to solve, she added, was that conventional driving schools aren’t conducive to the needs of neurodivergent people; the schools “don’t offer the flexibility or patience or support that [neurodivergent people] really need to succeed.”
“We saw an opportunity to fill that gap in a way that aligned with our mission at Nicklaus Children’s,” Dr. Potter said.
Fast-forward to this past week: I sat down for a brief interview with a young woman named Anna Mariani. Mariani, 24, is an alumna of the Drive Program, having gone through it herself a few years ago. When asked about her experiences in the Drive Program, she explained the one thing she appreciated most about it was its slow pace. “I could do things in my own time… it was well-explained, all the things that you needed to do while driving and paying attention to all the things [on the road],” Mariani said.
“It was good for me to practice being in the car,” she added.
Mariani said she first learned of the Drive Program through CARD, or the Center for Autism and Related Disabilities, managed by the University of Miami. She described CARD as a program which offers services to people in the neurodivergent community, telling me it was they who recommended the Drive Program “to learn how to drive.”
Mariani doubled down on her effusive praise for the Drive Program.
“I think [the Drive Program] helps because the teachers and instructors were really patient with me… I was able to be coached into what I needed to do,” she said. “Also, the virtual reality aspect was really good because it helped me feel like I was actually in the car. So when I got in the [actual] car, it felt more natural and that helped me feel more confident when I was actually driving the car.”
For Mariani, the Drive Program helped her best prepare for driving independently.
“With practice, it feels a lot more comfortable,” she said. “At first it was [a] little scary, but then I started doing it more, and now I’m more comfortable driving in a car in real life.”
Mariani went on to say she highly recommends the Drive Program to everyone who may benefit from it, adding the Program’s staffers were well-trained and invested in helping her learn. The Program overall, she added, is “really advanced.” Mariani noted she has encouraged a friend to enroll in the Drive Program and hopes they do so “soon.”
“I definitely think other people should give it a try if they’re nervous or they don’t know where to start when it comes to driving,” Mariani said in endorsing the Drive Program. “I think this is a good place to start [helping] others feel more comfortable.”
Airbnb Announces “Reserve Now, Pay Later” Service
San Francisco-based Airbnb on Thursday announced a new payment program it calls “Reserve Now, Pay Later,” whereby users can defer payments for upcoming reservations. The company says Reserve Now, Pay Later affords guests “greater flexibility” by allowing them to put $0 down upfront on all domestic bookings.
I learned of Airbnbâs initiative in a post on X by my friend Natalie Lung of Bloomberg.
“Available for listings with a moderate or flexible cancellation policy, guests don’t need to pay the full amount until shortly before the end of the listing’s free cancellation period. Cancellation policies selected by hosts remain unchanged, and because the payment from guests is always due before the free cancellation period ends, hosts have time to secure another booking even if a guest cancels,” Airbnb wrote in describing Reserve Now, Pay Later in its announcement. “This feature comes as new data reveals that today’s travelers are seeking more flexibility when it comes to booking a stay, particularly a group trip that requires arranging funds with friends or family.”
Notably, Airbnb mentions results of a survey of American travelers it conducted with Focaldata. Airbnb said 55% of respondents indicated they take advantage of flexible payment options, while 10% reported always opting for such services. Similarly, 42% said they have chosen to “[delay and miss out] on their preferred accommodation option because of time spent coordinating how to pay for their trip with co-travelers.”
As with laptops, the foundational piece of this news from Airbnb is accessibility. I’ve covered the company extensively over the last five years or so, having interviewed numerous executives there, and the reality is the new Reserve Now, Pay Later service is yet another part of Airbnb’s work in accessibility. Granted, it isn’t expressly or overtly designed for the disability community’s sake. The truth is, however, as with Walmart’s discounted $600 M1 MacBook Air I wrote about yesterday, most disabled people are extremely, perpetually budget-conscious. The majority of us don’t make much money, so anything we can do to save a few bucks here and there is appreciated, both for peace of mind and for our pocketbooks. In Airbnb’s case, that a disabled person could delay payment on a reservation makes travel far more accessible than aspirational. Better still, people with disabilities can utilize the accessibility features Airbnb has empowered its hosts to offer guests. Although Airbnb positions Reserve Now, Pay Later as a measure of convenience for the mainstream, the fact remains that accessibility, as ever, plays a central role in shaping its relevance and appeal.
Walmart Makes M1 MacBook Air More Accessible
Joe Rossignol reports today for MacRumors Walmart has begun selling the dearly beloved M1 MacBook Air for the low price (for MacBooks) of $599. The deal is for the laptop’s base configuration of 8GB RAM and a 256GB SSD in gold, silver, or space gray.
“In case you missed it, Walmart is currently offering the older but still very capable MacBook Air with the M1 chip for just $599 in the United States,” Rossignol wrote of the deal on Wednesday. “It seems like this deal began around Amazon’s four-day Prime Day event in early July, but it flew under our radar until a reader let us know about it today.”
As Rossignol notes, Apple discontinued the M1 Air last year when it added the then-new M3 models. Walmart announced it would carry the M1 Air (at $699) back in March 2024.
My reasoning for covering this news is, as ever, accessibility, quite literally. As Rossignol also notes, although the M1 chip is getting long in the tooth by technological standards (the M5 generation of Apple silicon is said to be on its way), the chip remains more than serviceable for everyday tasks like email, web browsing, word processing, and even light photo editing. From an accessibility standpoint, the value proposition of Walmart’s $600 MacBook Air is stratospheric; budget-conscious buyers, a lot which includes most people with disabilities, get a modern, eminently capable computer that’s small and lightweight to boot. For those who can’t afford the current (and admittedly better) $999 M4 Air, the M1 variety is, again, a veritable steal at hundreds of dollars less. Eventually, assuredly sooner than later, Apple’s M1 chip will be outmoded and obsolete, but that day is years away. Right now, today, the “low end” M1 MacBook Air, it could cogently be argued, is Apple’s most accessible Mac, and in more ways than one. In other words, for those who prefer macOS to the Mac-like iPadOS 26 (more on that from me soon), the inexpensive M1 MacBook Air is a revelation.
News of the $600 Air comes amid rumors Apple is preparing a “real” low-cost MacBook powered by the A18 Pro chip that’s sitting inside my iPhone 16 Pro Max. The device is purported to come out either late this year or early next, according to multiple sources.
The M1 MacBook Air is available on Walmart’s website.
Redesigned Netflix App Rolling Out to Apple TV
Ryan Christoffel reports for 9to5 Mac today Netflix has begun rolling out its redesigned app to Apple TV 4K users. The news comes months after the Bay Area-based company announced the design overhaul in May, during which chief product officer Eunice Kim described the new Netflix experience as “still the one you know and love, just better.”
“As spotted by users on Reddit, the new design seems to have launched with the latest tvOS app update,” Christoffel wrote on Wednesday. “If you’re not seeing it yet, make sure you’re running the latest version of the Netflix app.”
I got the design on the 2021 A12-powered Apple TV (running tvOS 18.6) in my office.
I covered news of the new UI when Netflix announced it, having attended a virtual briefing with the company a few days beforehand. As I wrote at the time, the design looks good (there’s a video on YouTube about it) and should prove to be more accessible than the old interface. I won’t rehash my thoughts on it here, but suffice it to say the redesign feels like a win for accessibility; in the couple minutes I spent noodling around the new app prior to writing this story, I enjoyed it very much. At the very least, it’s a much prettier design than what I literally used yesterday. As I also said in the spring, Netflix’s new design is conceptually akin to what Amazon did to Prime Video a year ago.
AirPods Reportedly Getting Live Translation Gesture
Marcus Mendes reports for 9to5 Mac this week a bit of new UI spotted in iOS 26 Beta 6, which was released to developers on Monday, suggests Apple is planning to enable real-time translation of live conversations on AirPods. The finding comes after the company announced live translations for FaceTime calls and more at WWDC in June.
“In today’s iOS 26 developer beta 6, we spotted a new system asset that appears to depict a gesture triggered by pressing both AirPods stems at once,” Mendes wrote of the new finding. “The image displays text in English, Portuguese, French, and German, and it is associated with the Translate app. For now, we can confirm it’s associated specifically with the AirPods Pro (2nd generation) and AirPods (4th generation).”
Mendes (rightly) notes using AirPods for translative purposes is “right up the wearable wheelhouse” for products like AirPods and Meta’s Ray-Bans. Indeed, from an accessibility standpoint, using earbuds (or glasses) for translation can be not only more discreet in appearance, but also more accessible in terms of not having to look at, say, Apple’s built-in Translate app while holding it. Such a dance may be hard, if not outright impossible, for those with suboptimal hand-eye coordination. Likewise, it’s highly plausible things like languages are more intelligible for people who are auditory learners or perhaps are neurodivergent. Whatever the case, Mendes is, again, exactly right to posit using wearables for translation is a perfect use case for the technology. Moreover, Mendes is also reasonable in his speculation this feature may have been kept under wraps because Apple plans to make it part of the iPhone 17 software story.
On a related topic, that AirPods are purported to gain a new gesture serves as a good reminder to give a brief shoutout to another AirPods gesture: the head gestures for accepting or declining calls. Much to my chagrin, I get a ton of spam calls every day, which I normally ignore and let go to voicemail. When I’m wearing my AirPods, however, the aforementioned head gestures act as a de-facto accessibility feature; instead of reaching for my phone to tap a button, I can merely shake my head to send those spam calls away. To use natural, nigh universally understood methods of nonverbal communication in this manner is genius, and it’s accessible too. Rather than search the abyss of my pocket(s) to hurriedly find my phone and take action on an incoming call, I can easily nod or shake my head as necessary. It’s undoubtedly convenient, as well as technically cool, but it’s also accessibility. Using head gestures to decide on phone calls alleviates a helluva lot of friction associated with using my phone for that.
Yet one more reason to choose AirPods over something like my Beats Studio Buds+.
Google Gives Gemini New “Guided Learning” Mode
Not to be outdone by OpenAI and ChatGPT, Google has given Gemini a new “Guided Learning” mode. The news came earlier this week from Jay Peters at The Verge.
CEO Sundar Pichai detailed Guided Learning in a post for Google’s Keyword blog.
“Answers from the Guided Learning mode can include things like images, videos, and interactive quizzes,” Peters said in his story. “The company worked with students, educators, researchers, and learning experts to ensure the mode is ‘helpful for understanding new concepts and is backed by learning science,’ according to Pichai.”
Google’s conceit with Guided Learning is similar to OpenAI’s insofar as the goal is not to give answers to students as though Gemini were a highfalutin answer key. Rather, Peters’ dek says the goal is much more pedagogical: Guided Learning aims to “[help] you work through problems” instead of unhelpfully giving you the answers. From an accessibility perspective, the conceit between Gemini’s Guided Learning and ChatGPT’s Study Mode is the same in that both can be counted on to present information in a single space. This can be helpful for people with various cognitive disabilities, for whom keeping track of myriad aids such as flashcards can, somewhat counterintuitively, become problematic. Chatbots can coalesce lots of information.
Once more I say, chatbots are more useful than merely being conduits for cheating by disengaged students. Study-oriented features can make learning more accessible.
Guided Learning comes amid OpenAI’s high-profile announcement of its newest model, called GPT-5. CEO Sam Altman described it as “the smartest model we’ve ever done.”
“Ode to the EarPods”
Basic Apple Guy, purveyor of well-made wallpaper, likes Appleâs wired earphones.
“Don’t get me wrong, I am still very much on team AirPods, but I have increasingly found use cases and situations where having a pair of good olde wired EarPods has proven quite useful,” he wrote in a new blog post. “They don’t need charging, they work with just about anything, and they’ve quietly aged into a little slice of tech nostalgia.”
Prior to 2016, when AirPods debuted alongside the iPhone 7 and 7 Plus, I spent many, many years using various incarnations of Apple-branded earphones. From the 30-pin iPod connector to the 3.5mm jack to Lightning, I’ve used them all across various iPods and iPhones. A major reason I found AirPods so revelatory almost a decade ago (!) lies in their cord-free nature. However long I used Apple’s cabled earphones, the biggest frustration with them, accessibility-wise, was untangling the cord. The image of a rat’s-nest cable Basic Apple Guy included in his piece is scary enough for Halloween; I tried so hard to keep the cord untangled, mostly without success. My hand-eye coordination is bad enough that I’d spend what felt like eons trying to untangle the cable, a task which always involved the most colorful expletives known to humankind. Thus, the advent of AirPods freed me from such torture. From a practical perspective, I also agree with Basic Apple Guy’s fondness for the EarPods’ remote. While the gestures/stem controls on AirPods are fine, I’ve never particularly enjoyed the sensation of pressing or swiping close to my ear. I tolerate it, but it’s sensory input that doesn’t at all feel good.
Between my slew of AirPods in my office, all of which span myriad generations and surnames, and my Beats Studio Buds+, I’ve surely no shortage of wireless earbuds to use; if something happens to one pair, I easily can reach for a backup. For travel purposes, however, I’ve made a point to have a set of the $19 USB-C EarPods as an emergency earphone solution in case my AirPods die or, worse, get lost or stolen.
The EarPods are an inexpensive safety blanket, and a great addition to my tech travel kit.
Apple Enhances AirPods Charging Case Interface to “More Clearly Indicate Charging Status” to Users
Apple has yet another de-facto accessibility feature coming in iOS 26.
Ryan Christoffel reports for 9to5 Mac this week the charging case for AirPods has been enhanced so as to more clearly signify its charging status. Christoffel notes a user on X, Minimal Nerd, posted a screenshot (embedded below) of a system card explaining the color codes and their meanings. The card reads the various lights “now more clearly indicate charging status.” Green means charged, yellow means in-progress charging, and orange means the case itself needs juice, according to the new user interface.
The change is apparently new to iOS 26 Beta 5, which Apple shipped earlier this week.
“The differences between yellow and orange are especially subtle, making it unclear whether users will be able to distinguish them,” Christoffel said of the UI. “Currently, Apple’s support document only notes green and amber as the indicator colors.”
Christoffel added Juli Clover at MacRumors reported there exists code in iOS 26 which has the system notify users “when it’s time to charge.” Clover also noted how, in prior iOS 26 betas, Apple sent iPhone notifications when one’s AirPods needed charging.
This new color-code system should make understanding charging status more accessible. To wit, it can be hard to decipher which color means what, particularly with ones like amber and yellow being so similar, as Christoffel rightly noted. The new system makes it much clearer. Additionally, it’s helpful to know one’s AirPods are being charged by listening for the little tone when putting the case on a wireless charging mat. Not only do you hear the chime, but you see the colored light appear with it. That bimodal sensory input can be important insofar as it reassures someone that they placed their AirPods in the right spot for charging. Unlike modern iPhones, AirPods don’t support MagSafe; this means a disabled person who, like yours truly, has lackluster hand-eye coordination potentially could miss the “spot” when trying to set the earbuds down to charge. Without the chime and/or light, you may think your AirPods are charging when, in actuality, they’re dying because you missed the spot by a quarter-inch or whatever.
As I said, these color codes (and the chime) are de-facto accessibility features.
A Mini Review of Walmart’s Onn 4K Pro
I missed it when news broke, but Ben Schoon at 9to5 Google reported in early June Walmart’s Onn 4K Pro streaming box was updated to run Android 14. The update, inscrutably named URO1.250103.029.A1, also brought the April 2025 security patch.
“Users shouldn’t expect any major changes from this update, though,” Schoon said in describing the June software upgrade. “Android 14 for TVs was mainly focused on TV sets, but it should make everything feel a bit more snappy.”
As Schoon wrote, the Onn 4K Pro is a seriously great deal; as of this writing, its price is only $45, down from the usual $50. I bought one several months ago out of curiosity and came away very impressed. The device runs stock Google TV, offering no Walmart-branded apps or the like. The remote, while plastic, feels nice in hand, and its buttons are responsive and nice to press. And the box can act as a smart speaker when you’re not watching anything. Performance-wise, the Onn 4K Pro is serviceable and does the job. As a devout Apple TV 4K user, however, Walmart’s box can’t hold a candle to Apple’s in terms of sheer power and overall fidelity. I oftentimes joke Apple TV is laughably over-engineered for its primary purpose (streaming video), but I really appreciate how performant it is when testing the competition. Say what you will about tempering expectations between a $50 box and a $130 box, but the user experience in navigating the menus, et al, is demonstrably and undeniably better on tvOS. More pertinently for my reporting, tvOS smokes Google TV in accessibility features too.
Where the Onn 4K Pro pulls ahead is in Google TV itself. While I generally prefer tvOS for its niceness and the amenities pertaining to the Apple ecosystem, I do have a soft spot in my heart for how Google TV makes finding stuff to watch easier, and arguably more accessible. Beyond getting the Liquid Glass treatment, tvOS 26 brings little improvement in the mechanics of the user interface. I maintain that, on screens as big as televisions, tvOS has the potential to be so much more than a static grid of icons. That the Apple TV app is a container for things you watch is backwards; the app should be the whole UI, just as on Google TV. Likewise, tvOS should integrate a live TV guide too. Every year, I hope Apple will finally give tvOS its overdue “iOS 7 moment” and do a top-to-bottom overhaul of the platform, but I am always left disappointed. I’m critical because, frankly, I greatly prefer Apple’s design to Google’s, functionality be damned. For all its warts, tvOS simply feels nicer to use than Google TV. But, as I said, that doesn’t take away from my admiration of all Google has implemented into the system for users.
As one prime example, the YouTube TV integration is killer if you’re a subscriber.
I heartily recommend the Onn 4K Pro over Google’s own box if you wanna wade into Google TV’s waters. Again, I was pleasantly surprised (and delighted) by Walmart’s box.
My pal Jason Snell wrote up a comparison of streaming boxes back in March.
I’m Filing This Under “I Learn Something Every Day”
This post’s headline says it all.
Last October, I interviewed the developers behind Croissant for iOS and macOS. I canât describe the appâs functionality better than its website does; it says Croissant is âa buttery smooth app for cross-posting to Bluesky, Mastodon, and Threads.â The two-person team of Ben McCarthy and Aaron Vegh told me, in part, Croissant wasnât expressly built for accessibilityâs sake, but nonetheless emanates as a byproduct. The duoâs overarching goal with Croissant was to make a piece of software which was âsomething simple and streamlined,â according to McCarthy. As I wrote, Croissantâs appeal in an accessibility context is that a disabled person neednât have to manually post the same thing to multiple services. Although copy-and-paste is a workaround, but still involves extra tapsâactions which can be taxing to many people out there who cope with any sort of cognitive/visual/motor conditions (or some combination thereof).
Thus, Croissant’s streamlining is accessibility too.
Anyway, one of my previous gripes about Croissant was that there was no button one could push to automatically generate image descriptions, or alt text, for images. Lo and behold, I went to use Croissant on my iPhone earlier today and noticed a small button that does just that! I asked McCarthy about it on Mastodon and they replied that the text-generation feature isn’t new and, in fact, has “been there for a while.” I incorrectly presumed the feature used AI, but it doesn’t; McCarthy told me it works by way of Apple’s VNRecognizeTextRequest API, a tool which Apple describes to developers as “an image-analysis request that finds and recognizes text in an image.”
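For the developers in the audience, here is a minimal sketch of how an app could lean on that Vision API to pull text out of an image. To be clear, this is my own illustration of Apple’s VNRecognizeTextRequest, not Croissant’s actual code, and the recognizeText helper name is mine:

```swift
import Vision

// Sketch: run Vision's text recognizer over an image and hand back the
// recognized lines joined together. `cgImage` is assumed to come from
// whatever photo the user attached to their post.
func recognizeText(in cgImage: CGImage, completion: @escaping (String) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else {
            completion("")
            return
        }
        // Each observation is one detected region; take its best candidate.
        let text = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: "\n")
        completion(text)
    }
    request.recognitionLevel = .accurate // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Note this only transcribes text that appears in the image, which squares with McCarthy’s point: it is optical character recognition, not an AI-written scene description.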
The moral here? Croissantâs accessibility game is even stronger. Go download it.
Controlling a computer with your Mind is possible
In the run-up to Global Accessibility Awareness Day in May, I reported on Apple’s yearly preview of the new accessibility features coming to its panoply of platforms later this year. Features like Magnifier for Mac, Accessibility Nutrition Labels, Name Recognition, and more are all now confirmed to be in Apple’s “OS 26” updates, currently in beta.
At the end of my aforementioned story, I mentioned the timing of its publication was fortuitous insofar as it coincided with a report from Rolfe Winkler of The Wall Street Journal that Apple purportedly has been developing so-called BCIs, or brain-computer interfaces, to assist people coping with motor disabilities. Moreover, I noted that my pal Chance Miller wrote for 9to5 Mac that researchers strongly believe BCI has the potential to “revolutionize” the way(s) in which disabled people access computers. Miller also said Apple is expected to “add broader support for BCIs” to Switch Control later this year.
This bit of preamble is pertinent now because Miller’s colleague Ryan Christoffel reports this week that BCI maker Synchron, with whom Apple is said to be collaborating on the technology, posted a video (embedded below) showing a man named Mark Jackson using Synchron’s BCI to control his iPad. Jackson, who has ALS, has been an early tester of the technology and was interviewed for Winkler’s piece for the Journal. Jackson is one of only 10 people to be fitted with Synchron’s Stentrode implant for the FDA-approved trial. The Stentrode device uses electrodes to read brain signals and act upon them.
According to Christoffel, the description of Synchron’s video calls out a new API built by Apple, Human Interface Device (HID), which is designed to work with a BCI device. Synchron calls Jackson’s demonstration “groundbreaking” in the way he “navigates his iPad home screen, opens apps, and composes messages using only his mind.”
BCI tech, like time travel, feels like something out of a sci-fi novel, but it’s real! This “mind control” tech truly does hold profound potential to bring greater accessibility to people who have severely limited motor skills, if any at all. On a related note, I’ve long been fascinated by the work done by Elon Musk’s Neuralink for accessibility’s sake, and would legit love to pick Musk’s brain on the topic in an interview someday.
Tesla’s Ride-Hailing Service Launches in Bay Area
Late last week, Elon Musk posted on X that Tesla’s ride-hail service is now available in the Bay Area. The Tesla AI account posted on X that invitations to use the service are “going out now.”
Ryan Mense, writing at Bay Area-based CW affiliate KRON4, reported last week that the aforementioned X post by Tesla AI includes a service map. Mense notes the service area includes “boundaries of Marin County and Berkeley in the north and San Jose to the south, [with] eastern and western boundaries cover[ing] cities near the bayshore.” Of particular import is the distinction that the Tesla ride-hail service is not driverless; there is a human in the vehicle during trips. In other words, there’s no FSD mode in these cars.
Musk’s post comes after The Verge reported Tesla sought permission to operate here.
I’ve seen a few Tesla diehards post about using Tesla’s ride-hail service in Austin. Personally, I have no interest in trying it here in San Francisco; Waymo and Uber more than satisfy my needs, and, frankly, I’m no fan of Musk. Nonetheless, between Tesla, Waymo, Uber/Lyft, and even Zoox, Tesla’s news is yet another example of San Francisco’s, and the greater Bay Area’s, place as a hotbed for innovation. In an accessibility context, the more app-based, on-demand ride-hail services exist, the better for people who are, say, Blind or low vision and thus precluded from driving on their own. The nerds (and venture capitalists) like to crow about the technical might of artificial intelligence, not to mention the coolness and convenience of summoning rides from one’s iPhone, but the reality is this technology is much more impactful than sheer coolness or even novelty. Ride-hailing services offer accessibility and inclusion, as well as imbuing heightened feelings of self-esteem through greater agency and autonomy. As I always say, this stuff is 100% non-trivial.
Corporation for Public Broadcasting Announces “Responsible and Orderly Closeout” Amid Cuts
The Corporation for Public Broadcasting (CPB) on Friday announced its decision to begin “an orderly wind-down of its operations” as a result of the organization’s recent loss of federal funding. The exclusion of CPB from the Senate Appropriations Committee’s FY 2026 Labor, Health and Human Services, Education, and Related Agencies (Labor-H) appropriations bill was the first in “more than five decades.”
Congress authorized the Washington, DC-based CPB’s formation in 1967 to act, as the nonprofit organization says, as “the steward of the federal government’s investment in public broadcasting. It helps support the operations of more than 1,500 locally managed and operated public television and radio stations nationwide.” Moreover, the CPB notes it is “the largest single source of funding for research, technology, and program development for public radio, television, and related online services.”
“Despite the extraordinary efforts of millions of Americans who called, wrote, and petitioned Congress to preserve federal funding for CPB, we now face the difficult reality of closing our operations,” Patricia Harrison, CPB’s president and CEO, said in a statement accompanying the announcement. “CPB remains committed to fulfilling its fiduciary responsibilities and supporting our partners through this transition with transparency and care.”
According to the CPB, its employees have been notified that “the majority of staff positions will conclude with the close of the fiscal year on September 30, 2025,” with a “small transition team” remaining in place through January of next year in an effort to “ensure a responsible and orderly closeout of operations.”
“Public media has been one of the most trusted institutions in American life, providing educational opportunity, emergency alerts, civil discourse, and cultural connection to every corner of the country,” Harrison said of the CPB’s raison d’être. “We are deeply grateful to our partners across the system for their resilience, leadership, and unwavering dedication to serving the American people.”
The loss of the CPB is gut-wrenching for public, independent media, as well as for diversity and inclusion. I’ve covered the work of PBS Kids extensively over the last few years, covering the programming and more that make up the confluence of disability, technology, and television. In fact, just last week I reached out to the network seeking color from its senior vice president and general manager, Sara DeWitt, about what the Trump administration’s budget cuts may mean for her team. I’ve interviewed her on numerous occasions, but PBS Kids declined comment this time. Be that as it may, one needn’t get an on-the-record interview to know which way the wind is blowing; to wit, as with the cuts to SNAP and Medicaid, disability inclusion and representation is taking it especially hard on the chin lately. In the case of broadcasting, PBS Kids produces shows like Carl the Collector that not only put disabled people in the spotlight, but also give every child (and their families) crucial lessons in why showing compassion and empathy is important. With the CPB shuttering operations, such educational opportunities are in serious peril, to the detriment of society writ large.
I’ll report back if and when PBS Kids makes official statement(s) on these matters.
On “Personal Superintelligence” and Accessibility
Earlier this week, Mark Zuckerberg published an essay about what he calls “Personal Superintelligence.” The 600-word post comes on the heels of Meta’s recent spending spree to staff up its Superintelligence Labs group, during which Meta has poached several Apple employees working on AI. The seemingly ever-growing list notably includes Ruoming Pang, who led Apple’s foundation models team.
Pang’s defection to Meta was first reported by Bloomberg’s Mark Gurman last month.
But back to Zuckerberg and artificial intelligence.
“It seems clear that in the coming years, AI will improve all our existing systems and enable the creation and discovery of new things that aren’t imaginable today,” he wrote. “But it is an open question what we will direct superintelligence towards.”
Those who harbor more cynical inclinations towards Zuckerberg and Meta have, somewhat rightfully, labeled his latest manifesto as much ado about nothing. After all, Zuckerberg was once bullish on the so-called “metaverse” portending the future of technology; in with a bang, but out just as quickly with nary a whimper. Personally, I too thought the metaverse was nothing more than a big bunch of hooey. That said, I’m willing to acknowledge he does manage to plant a few kernels of truth in his piece.
The “nut graf” of Zuckerberg’s post concerns wearables; namely, glasses.
“The intersection of technology and how people live is Meta’s focus, and will only become more important in the future,” he said. “If trends continue, then you’d expect people to spend less time in productivity software, and more time creating and connecting. Personal superintelligence that knows us deeply, understands our goals, and can help us achieve them will be by far the most useful. Personal devices like glasses that understand our context because they can see what we see, hear what we hear, and interact with us throughout the day will become our primary computing devices.”
Personal devices that know our context is a sentiment which resonates deeply with accessibility. As a devout Apple user, it isn’t hard to look at something like Vision Pro and, despite how cool and cutting-edge the headset is, envision its technology shrinking to fit normal-sized glasses. Apple knows this too, but you gotta start somewhere, so $3,500 buys you baby steps into the future. From a disability standpoint, the allure is obvious: Vision Pro’s mixed reality makes it such that software can be layered onto the real world, literally in front of one’s eyes. Even Apple’s Liquid Glass, it could be argued, was created partly with Apple’s accelerated roadmap in mind. It’s a design language that seems (to me, anyway) ideally suited for products like Vision Pro and more.

The dividends are of limited utility right now beyond sheer novelty, but think of Zuckerberg’s aforementioned glasses. Imagine, for instance, a future version of visionOS running on a pair of glasses similar to Meta’s own Ray-Bans that shows you turn-by-turn directions in Apple Maps, incoming texts in iMessage, or even a person’s contact card as they approach you. For someone who’s Blind or low vision, as is yours truly, it would be extremely accessible for Siri to say, “Josie is approaching, here’s her information” if it’s hard to make out a person’s face and/or physique from afar. Maybe some of this information is relayed through AirPods, but the salient point is simply that, for certain things, a pair of “Apple Vision Glasses” would be more useful, and more accessible, than the iPhone in our pockets. Put another way, it’s why Apple Watch is such a capable satellite device today. To wit, beyond mere convenience, it can be far more accessible (and more convenient) for many disabled people to raise their wrist for notifications than to reach for the phone in their pocket.
Maybe Zuckerberg is ultimately wrong in his prognosticating. Maybe the smartphone truly is the end-all, be-all form factor for mobile computing. I, for one, wouldn’t anoint him (and by extension, Meta) to lead the charge on the next technological revolution. But assuming he’s right, at least in certain respects, his blurb here about the evolution of personal computing vis-à-vis glasses may well prove remarkably prescient over the next decade or two. Whatever happens, one thing will remain crystal clear: the disability community is rife with technologists, and any advancements in technology will be embraced with unbridled enthusiasm if they help us better access the world we live in.
OpenAI Adds “Study Mode” to ChatGPT
OpenAI this week announced a new pedagogical feature for ChatGPT: Study Mode.
“ChatGPT is becoming one of the most widely used learning tools in the world. Students turn to it to work through challenging homework problems, prepare for exams, and explore new concepts. But its use in education has also raised an important question: how do we ensure it is used to support real learning, and doesn’t just offer solutions without helping students make sense of them?” OpenAI wrote of Study Mode. “We’ve built Study Mode to help answer this question. When students engage with Study Mode, they’re met with guiding questions that calibrate responses to their objective and skill level to help them build deeper understanding. Study Mode is designed to be engaging and interactive, and to help students learn something—not just finish something.”
As to technical details, OpenAI says Study Mode was built using “custom system instructions we’ve written in collaboration with teachers, scientists, and pedagogy experts to reflect a core set of behaviors that support deeper learning including: encouraging active participation, managing cognitive load, proactively developing metacognition and self reflection, fostering curiosity, and providing actionable and supportive feedback.” The behaviors, the company added, “are based on longstanding research in learning science and shape how Study Mode responds to students.”
Study Mode was “built with college students in mind,” according to OpenAI.
Study Mode is, as ever, pertinent to accessibility as a de-facto assistive technology. While teachers and university professors are apt to loathe software like ChatGPT and its ilk because of the ways in which it ostensibly stunts the learning process by giving students an instant, and virtually infinite, answer key, the truth is such criticism goes only so far. In a disability context, ChatGPT’s new Study Mode could plausibly be a boon to, say, neurodivergent people with unique learning styles. Having ChatGPT help with prompting and coalescing information into a single space can be worth its weight in gold; it may be far more accessible for someone to keep track of the subject matter using ChatGPT than with a bunch of flashcards or notes strewn about in various places. Likewise, someone with cognitive, motor, or visual disabilities (or some combination thereof) may find Study Mode a more accessible methodology than juggling a trillion browser tabs. This scenario reminds me of an anecdote I’ve shared before: Jenny Lay-Flurrie, chief accessibility officer at Microsoft, told me in an interview last year that her neurodivergent teenage daughter found the ChatGPT-powered Bing search a more accessible tool for doing research when writing essays for her English classes.
As I’ve said, chatbots have utility. They’re not merely conduits for lazy people to cheat.
Study Mode is available now to users on the Free, Plus, Pro, and Team plans. ChatGPT Edu users will get the feature “in the next few weeks,” OpenAI said.
Easterseals CEO: SNAP, Medicaid Cuts in Trump’s “Big Beautiful Bill” a “Double Whammy” to Disabled People
Earlier this month, Liza Berger at McKnight’s Home Care Daily Pulse posted an interview with Easterseals president and chief executive Kendra Davenport in which Davenport detailed how Americans with disabilities are impacted by the Trump administration’s One Big Beautiful Bill Act. President Trump officially signed the bill into law on July 4.
Easterseals, founded in 1919, is America’s oldest disability nonprofit organization.
Among the provisions of Trump’s “Big Beautiful Bill” are substantial cuts to Medicaid and SNAP, or food stamps. The cuts, Berger said, amount to $900 billion and will, Davenport said, “deal a punishing blow to many people with disabilities, including many seniors.”
“Many people with disabilities are reliant on [food assistance], especially seniors,” Davenport said about the massive budget cuts levied by Trump and his cronies. “If that goes away, it’s a double whammy. They’re losing their healthcare. They’re losing their access to food and their assistance to be able to nourish themselves consistently. Those are big concerns, and it’s all hitting the same people, if you will.”
In addition to the SNAP cuts, Davenport sounded the alarm on the cuts to Medicaid services (known here in California as Medi-Cal) and their effect on so-called direct service professionals, or DSPs, who visit disabled people in their homes and help them with independent living. The Medicaid cuts have a collateral-damage effect on these workers, as Berger notes they will receive less money per hour. For its part, Easterseals, according to Berger, “has 70 affiliates and touches 70 million people including older adults and veterans.” The reduction in wages for DSPs, she added, may prove too much for the organization’s affiliates to bear; the result would ultimately be fewer of these professionals for members of the disability community to lean on for crucial support.
Berger’s story is worth a read in its entirety. The Trump administration’s cuts to such crucial services do, in my opinion, underscore society’s general disdain for disability and disabled people. We’re seen as less than human, extant primarily to serve as inspiration for overcoming adversity, the odds, and our own bodies. It’s discouraging (and grossly ableist) but also entirely predictable. More broadly, these budget cuts also show the evil callousness of Trump and his sycophants, a reminder in itself that the modern Republican Party is decidedly not the Republican Party of George H.W. Bush, who signed the Americans with Disabilities Act into law 35 years ago this month.
Technologically speaking, the SNAP cuts come amid Uber announcing an expansion of the retailers that accept SNAP payments in Uber Eats. The less money disabled people have to spend on Uber Eats, the less food they’ll have to eat. Which goes back to my point in the previous paragraph: most abled people don’t give two shits about people like me.
Gemini App Gets Remade “Audio Overview” Player
A report from 9to5 Google’s Abner Li this week brings with it news that Google has added what Li describes as a “nice quality-of-life update” to the Gemini app on iOS and Android: the ability to generate Audio Overviews, replete with native playback controls. The feature is in version 16.27 of the Google app on Android, as well as in Gemini on iOS.
“Previously, tapping on a generated Audio Overview opened the file in your browser with a long URL,” Li wrote of the interface changes for Audio Overview. “You could listen in that Chrome tab or download (and use the Files app) for an unwieldy experience. Now, the Android and iOS app, like gemini.google.com, uses a native player.”
Li’s story includes screenshots (on Android) showing Gemini being asked to “Generate Audio Overview” of various PDF files. In a broad scope, this functionality strikes me as conceptually similar to how Forbes, for instance, includes a button on webpages that people can click or tap to have an article read aloud to them. It effectively turns news stories into audiobooks, which, as Dr. Victor Pineda told me last year, were originally conceived by people in the Blind community. Ipso facto, the Gemini app’s new Audio Overview feature is, at its core, an accessibility feature. Beyond Blind and low vision people, the overviews may very well be a boon to, say, people who are strong auditory learners in grasping information. Likewise, it’s plausible someone with limited range of motion may find audio content more accessible than manually scrolling the aforementioned PDF document. Whatever the reason(s), it’s obvious the existence of Generate Audio Overview is as much about accessibility as it is purported convenience.
For its part, OpenAI has a Voice Mode for ChatGPT. I covered its Read Aloud feature last year, with my story including an interview with OpenAI’s Joanne Jang. She told me all about Read Aloud, as well as OpenAI’s philosophy on prioritizing accessibility for all.