How Vimeo and Case Jernigan Are Using Their ‘Noggin’ to Market the MS Community
Across the expanse known as the tech industry, artificial intelligence is decidedly the technology du jour. AI-powered algorithms are particularly en vogue. Their effects are most felt on social media platforms like Facebook, Instagram, and X; the humans running these companies seemingly believe the robots are more intelligent (and efficient) at presenting users with information they’ll find interesting or relevant. Human curation still is very much a thing, though. As one example, Apple has fleets of humans who edit the App Store, Apple Music, Apple News, and other services daily.
Count New York City-based Vimeo as another fan of flesh-and-blood curators.
Last month, Vimeo marked National Developmental Disabilities Awareness Month by awarding a Staff Pick to 5-time Staff Pick honoree Case Jernigan for his latest animated short film, Noggin. The 7-minute movie, critically acclaimed at festivals such as Slamdance 2025, chronicles Jernigan’s journey as he goes through life while managing multiple sclerosis. For Vimeo’s purposes, the raison d’être for its Staff Picks initiative is to shine the proverbial spotlight on work from indie artists exactly like Jernigan—to wit, people who hail from marginalized and underrepresented communities whose work is oftentimes overlooked by algorithms designed to cater to mainstream audiences.
According to Meghan Oretsky, who serves as lead curator running Vimeo’s Staff Picks program, the program has been around since 2008. For 17 years, she told me during a recent interview, Staff Picks has stood as “a collection of the best short films in the world” created by emerging, as well as established, independent filmmakers. Oretsky’s team sorts its picks into 10 categories; “when we’re slotting films into the channel,” she told me, “we’re looking to fill each of those buckets.” Oretsky has been at Vimeo for 11 years, almost as long as the Staff Picks program has been in existence. She said 9 of those years have been spent in the trenches with the company’s curation team.
“I’ve been here for quite a while, so I know quite a bit about our platform,” Oretsky said.
When asked about Vimeo’s methodology for selecting its Staff Picks, Oretsky described it as a “really democratic process.” The curation team, she said, is “tiny” and uses a majority vote to determine which films receive endorsement by way of badging. Specific to Jernigan, Oretsky explained his first Staff Pick came in 2016, and that Noggin gives voice to a perspective on disability not “often heard in film.” To this day, Jernigan’s voice is one Oretsky and her charges want to elevate and one they still “absolutely love.”
Oretsky went on to say Vimeo institutionally “feels a responsibility” to amplify people whose stories “highlight the wide spectrum of human experience around the world.” She added it’s imperative to Vimeo that the company “surface stories you don’t hear enough of, but are true to a specific human experience.” There are odes to LGBTQ people, Black people, and many more; disabled people are amongst those highlighted.
“We [at Vimeo] think it’s really important to have special places for [these stories] on the platform,” Oretsky said.
Oretsky was effusive in her praise for Jernigan and for his work on Noggin.
“[Noggin is] like collage meets spoken poetry meets musical musings—it has [Jernigan’s] fingerprints all over it, which makes it really unique,” she said. “He uses literally his own voice to share his world in such a playful and warm and, in the end, an incredibly vulnerable stream of consciousness way. I really love that he uses hand-cut shapes with it… he literally has his fingerprints on it. You can see he’s created that himself with scissors. He creates this dreamy world somewhere between like, our collective reality as humans and his own personal vision as a person who lives with multiple sclerosis. I watched Noggin again, and there was one quote that I felt like summed up his voice. He says, ‘Your spine is spaghetti and your brain is a meatball and you’re on a bicycle on the top of a hill and you’re 4-years-old and you’re about to barrel down at light speed until you crash into the neighbor’s fence.’ It’s so relatable, but it’s also such a creative way of expressing what it’s like to be in his brain. Whenever an artist expresses themselves in their own way, [the audience is] always walking through the grooves of their brain. But when you’re watching Case’s films, it feels like you’re being led through a playground or a safari, and there are these Easter eggs you find every time you watch it. His films are never a one-time watch. They’re such a joy to experience, even when he’s sharing tough times suffering with his relapses with MS.”
In his own words, Jernigan described himself to me as a “storyteller at my core,” with filmmaking representing the “culmination of my interests” around drawing, writing, and making music. He explained Noggin came to be after he received his MS diagnosis and started showing symptoms. He called the film a “coping mechanism,” a way to move his hands and arms while “trying to just express myself without thinking too much.” It would be months later that Noggin began to assume a more concrete shape, but its essence, Jernigan reiterated, lies in its role as “[a] pure coping mechanism.”
“The film was made for myself… it was reassurance for myself,” he said.
When asked what he wanted audiences to get out of Noggin, Jernigan told me his primary aim was “trying to get at that balance between sharing something very personal and very specific to me—the diagnosis and symptoms of a very rare disease—and telling something about that story.” He added that through the artistry of the film—the music, the movement, and more—he sought to share what he called a “universal poetry” conveying the universality of grief and struggle. “These [concepts] are things we have all experienced,” Jernigan said. “That was my goal: to land at that balance.”
Jernigan said Noggin’s aperture widened considerably from its original conceit over time, which he appreciates because its growth coincided with his own healing. His form of MS, known as relapsing-remitting, is the most common. Medication helps, and the experience varies from person to person, but the CliffsNotes version of Jernigan’s type of MS is that there are periods of really bad flare-ups followed by periods of healing. As Jernigan’s healing began, his self-described “artistic instincts” began to activate and he could imagine how Noggin might play out and with whom he’d want to share it. Vimeo, he said, was the place he wanted the film to live such that “more people could see it if they were interested.” The company, he added, was “very kind” to award him yet another Staff Pick and promote the film accordingly so it could broaden its reach.
Vimeo has been “great” to partner with, according to Jernigan. He called the platform the place where he “first fell in love” with independent animation. It was “always a goal” to someday earn a Staff Pick from Oretsky and crew after many films he admired and adored received the virtual thumbs-up from the curation team. All told, Jernigan expressed gratitude for the relationship he’s cultivated with Vimeo, saying the company has been “very good” to him over the years since first getting a Staff Pick.
For her part, Oretsky believes Jernigan’s films are so resonant largely because the medium “has a really powerful and unique ability to hold up a mirror [and show] what it’s like to be a human being and to feel validated and feel more alive as a human and feel like you’re not alone.” Indeed, Jernigan told me he’s heard from many fellow MS copers who express their deepest thanks to him for making Noggin and, to Oretsky’s point, for making them see they aren’t alone on their journeys. Vimeo, Oretsky said, provides a “lower barrier to entry” for people like Jernigan to “share what’s in his heart” with the wider world. The comments sections of his videos are a microcosm of the emotional aftermath of Jernigan’s films; Oretsky said Vimeo gets countless emails and other correspondence regarding the enduring impact of his work, and of the work of filmmakers like him.
“That’s a really special thing about our platform,” she said.
Jernigan detailed the profundity of the response to the fruits of his labor.
“One of the things I’ve been thinking about lately is several people with MS have reached out and people have said something along the lines of ‘You put something in a physical form—an artistic and musical form—I had thought about for years but didn’t quite know how to express.’ One of those people said there were thoughts she’d been harboring for 30 years… she saw and felt them in the film,” he said of feedback he’s received from people about Noggin. “That kind of comment [and] that kind of message meant the world to me. It meant everything to me because, with the film, I was trying to get at showing what [dealing with multiple sclerosis] was like without being overly singular or too tight. I wanted to make something that was loose and open, but really did show what the anxiety of [having MS] is like and what the pain is like. Hearing from people who have been in my shoes… it was amazing and incredibly humbling to hear.”
Looking towards the future, Jernigan characterized himself as an “ambitious” person and said some of the realities of his MS have forced him to reevaluate what he can feasibly achieve. His plan is to “do everything in my power” to continue making movies as long as possible, as the art is “my way of communicating and my way of learning about myself and the world I live in and a way to collaborate with people that I care about.” Moreover, he’s no Nostradamus, so predicting his future is impossible. What is possible is Jernigan listening to his medical team and carrying with him a positive attitude. He conceded some days are better than others, but Jernigan currently relishes being in “a new space where humor and optimism are quite readily available to me,” adding he’s acutely aware of the cyclical nature of his condition and copes as best he can.
As to Vimeo, Oretsky’s hopeful to keep doing what she’s doing with her charges on the curation team and especially to keep “elevating stories of people with disabilities.”
Ubisoft Announces New Chroma Colorblindness Tool Is Now Publicly Available
Video game publishing giant Ubisoft announced on Monday its open-source colorblind simulation tool, called Chroma, is now available to the general public. The announcement was made in a blog post bylined by the company’s Brittany Spurlin.
Ubisoft says an estimated 300 million people worldwide cope with colorblindness. Chroma, Ubisoft notes, is thus designed to give developers the ability to “accurately replicate their experience and create accessible content to meet their needs.” The tool works by “applying a filter over the game screen without hindering performance.”
According to Spurlin, Chroma’s development began in 2021 when Ubisoft’s quality control team, based in India, sought to create a tool which gave real-time feedback and allowed developers to maintain control of the game while simulating colorblindness. Furthermore, Chroma’s so-called Color Oracle algorithm can be used on a single screen or dual screens, and the tool supports hotkeys and features a customizable overlay.
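Ubisoft’s announcement doesn’t detail Chroma’s internals beyond “applying a filter over the game screen,” but colorblindness simulators of this sort typically multiply each pixel’s RGB values by a 3×3 transformation matrix. Here’s a minimal, purely illustrative sketch of that general technique in Python—the matrix is the commonly published Machado et al. full-severity deuteranopia matrix, and the function names are mine; Chroma’s actual Color Oracle algorithm may well differ:

```python
# Commonly cited full-severity deuteranopia simulation matrix
# (Machado et al., 2009) applied to linear RGB. This sketches the
# general screen-filter technique, NOT Ubisoft's implementation.
DEUTERANOPIA = [
    [ 0.367322, 0.860646, -0.227968],
    [ 0.280085, 0.672501,  0.047413],
    [-0.011820, 0.042940,  0.968881],
]

def simulate_colorblind(rgb, matrix=DEUTERANOPIA):
    """Apply a colorblindness simulation matrix to one linear-RGB pixel."""
    r, g, b = rgb
    out = []
    for row in matrix:
        v = row[0] * r + row[1] * g + row[2] * b
        out.append(min(1.0, max(0.0, v)))  # clamp to the displayable range
    return out

# A pure red pixel loses most of its red/green distinction:
print(simulate_colorblind([1.0, 0.0, 0.0]))
```

In a real game, a shader would apply this same matrix to every pixel of the rendered frame each time it's drawn—which is why Ubisoft can claim the filter works “without hindering performance.”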
“Over the past few years, Chroma has proven to be a highly efficient tool for us at Ubisoft,” David Tisserand, Ubisoft’s director of accessibility, said in a statement included with Ubisoft’s announcement. “It has allowed us to assess the accessibility of our games for colorblind players much faster and more comprehensively than ever before. Because we believe accessibility is a journey, not a race, we’re thrilled to share Chroma with the entire industry. We invite everyone to benefit from it, provide feedback, and contribute to its future development.”
Ubisoft has shared details on Chroma on GitHub.
Today’s news follows the announcement earlier this month from the Entertainment Software Association that Ubisoft and fellow heavyweights Electronic Arts and Nintendo of America have banded together in a “cross-industry effort” to form the new Accessible Games Initiative. The AGI, as it’s colloquially known, is dedicated to “[providing] players with information on accessibility features in video games.”
The New Apple TV+ Sale Is a Trojan Horse Bringing Disability Awareness to Every Subscriber
Tim Hardwick at MacRumors reported earlier this week about a “limited-time discount” currently in effect on Apple TV+ that sees the price of the nearly 6-year-old streaming video service drop to $3 for three months. The promotion runs through April 24 and is available to what Hardwick described as “new and eligible returning subscribers.”
After the 90 days, Apple TV+ costs $10 per month. It’s included in the Apple One bundle.
Apple’s offer is interesting from an accessibility standpoint in two ways. First, the bargain basement price tag is a great opportunity to access high-quality content with a minimal hit on one’s pocketbook. For budget-conscious buyers, a group of which the vast majority of disabled people are part, the savings (albeit brief) will be welcome for anyone trying to get the most bang from their entertainment buck. Moreover, the cost savings mean someone may finally be able to immerse themselves in the cultural zeitgeist with shows such as Severance and Ted Lasso. It has the potential to be a real boon in a social-emotional context, as perhaps a disabled person with limited access to socialization—for health or other reasons—now has an opening to gab with family and friends about, say, the cliffhanger ending the Season 2 finale of Severance. At first blush, this may seem like a trivial thing; the reality is super-popular TV shows like Severance are the type of proverbial water cooler fodder that ostensibly “everyone” talks about, but which isn’t true for literally everyone. That Apple TV+ is inexpensive right now, however briefly, opens the possibility for said conversations to be more inclusive of more people.
Secondly, the $2.99 price point is an unbeatable entryway into what I believe to be the most earnest and genuine portrayal of disability representation in Hollywood today. For those in the disability community, the benefit is obvious: there are multiple pieces of content on TV+ which thrust people who look like them into the limelight. In an industry, and a society, which perpetually infantilizes and pities disabled people as moribund, that shows like See or Best Foot Forward or Little Voice—all shows which have long ceased development yet remain worthy watches—so richly depict disabled people as capable and human as anyone else is truly and profoundly endearing. For people without disabilities, these shows offer chances to see disabled people do cool stuff like, in the case of See, Blind people being badasses and kicking ass. It stands in stark contrast to the typical inspiration porn disabled people are so often made to feature in; we deserve to be the lead story instead of being relegated to the last one on many nightly newscasts.
I love Severance as much as the next person—and I was overjoyed this morning to learn The Morning Show returns this summer—but nonetheless love Apple TV+ largely due to the representational gains I reap from it. It means the world to me, as it is reaffirming of Apple’s institutional ethos on the accessibility and inclusivity of its computers. It isn’t an empty bromide. It isn’t pandering or performative. The company’s commitment to disability inclusion in Hollywood is as ironclad as its commitment to disability inclusion in its myriad software platforms. In terms of its entertainment arm, Apple is deserving of more adoration vis-a-vis TV+ for pushing societal justice and diversity forward. CODA, as another example, is so much more than “a sweet little movie.” Indeed, the film’s representational significance and critical acclaim are inextricably tied. That ought to be recognized by the mainstream critics who staff Deadline and The Hollywood Reporter.
Apple TV+ is assuredly much more, praise Kier, than simply the Severance Network.
Google’s April Pixel Drop Shows How AI Can Be More Transformative Than Trivial Pursuit
As reported by 9to5Google’s Abner Li, Google this week released its monthly Pixel Drop for Android. April’s edition brings with it a single feature: Gemini Live’s new Astra camera. Li writes the functionality is now available on all Pixel 9 phones, free of charge.
The “Astra” name refers to Project Astra, a DeepMind initiative from Google characterized by the Bay Area-based tech titan as “a research prototype exploring future capabilities of a universal AI assistant.” On its website, Astra is touted as helping people “explore [their] world like never before,” which is precisely what Gemini Live’s newfound camera is purported to do. To wit, after installing the software update, users gain the ability to point their camera at anything in the real world and have a conversation about it. In a brief video posted to YouTube (embedded below), Google demonstrates the Astra camera’s functionality by showing a person asking Gemini to assist with choosing between color options for their newly-glazed piece of pottery.
The notion that artificial intelligence can help people “explore the world” has deep resonance to accessibility, particularly to those in the Blind and low vision community. That a Blind Android user, for instance, could point their Pixel 9 device at some object—say, a street sign—and have it described to them has enormous potential to help make getting around one’s neighborhood more accessible—which, in turn, engenders heightened feelings of self-esteem and independence. Likewise, the Astra camera could be used in the kitchen at home, where a Blind or low vision person could create a makeshift Be My Eyes experience by pointing their phone at a carton of milk and asking about the expiration date on the label. These scenarios, both ostensibly ho-hum aspects of everyday life, are made eminently more significant by technology; whereas typically most, if not all, people with blindness or low vision need assistance to keep tabs on the milk in the refrigerator, they now can do so with agency and autonomy with the help of Gemini Live on their phone. That isn’t at all trivial, especially considering the prevailing, and arguably ableist, societal perception that disabled people are rather hapless and helpless in our ability to function with any standard sense of normalcy.
Spiritually speaking, the Astra camera isn’t dissimilar to Apple’s Visual Intelligence on the 2023 iPhone 15 Pro and last year’s iPhone 16 lineup. In fact, the basic conceit is exactly the same: point your phone’s camera at something and the virtual assistant tells you about it. (Also highly similar are the myriad Detection Modes found within the Magnifier app on iOS.) How performant one is versus the other is largely immaterial in an accessibility context, as both are shining examples of the genuine good AI can do for the world. Tech companies, whether Apple or Google, like to hawk their AI-powered wares to the mainstream as conduits towards convenience. Though it is true something like the Astra camera indeed is convenient, its accessibility merit ought to be celebrated with more revelry. Whether Astra or Visual Intelligence, people choose whichever fits them best; the salient point is simply these technologies can be as transformative and life-altering for some as they are bleeding-edge and fun for others.
Finally, a few cursory thoughts on Google Gemini. A recent spate of sponsorships on one of my favorite nerdy podcasts pushed me to install the Gemini app on my iPhone and try it out. The app supplanted ChatGPT on my Home Screen, which I’d heretofore been using for my generative AI needs. It’s been a little over a month since switching to Gemini and I gotta say: I really like it. In terms of performance and acumen, Gemini seems to stand toe-to-toe with ChatGPT; I’ve noticed some errors and hallucinations, but Gemini mostly gives me what I want from it. It also helps that, as far as UI design is concerned, I’ve come to prefer Gemini over ChatGPT. Whereas ChatGPT feels staid and utilitarian, Gemini feels more “human” and whimsical in terms of its design. Pragmatically, Gemini has taken over much of the grunt work from Google Search in Safari in my usage. I find it easier to give Gemini a question or command, then use the in-app browser if I want to dive deeper. It feels much more accessible than using the browser and scanning through umpteen search results—which, from a disability standpoint, demands a good amount of energy in terms of vision and fine-motor skills.
My close friend Joanna Stern at the WSJ has similar feelings, writing in a recent column that she isn’t “going back” to the tried-and-true Google Search mechanics after using a slew of popular AI agents in testing. Stern’s lede says it all: “Somewhere out there, Jeeves is raising his teacup to the AI revolution.” Jeeves, as she notes, was the ’90s-era ancestor to products like Google Search, Gemini, and more. Jeeves was limited in its capacity, Stern goes on to say, with the current crop of AI-driven assistants actually fulfilling the promise espoused by their ancestral digital butler. Stern’s focus isn’t accessibility—although we’ve talked a lot about it over the years—but her experiences underscore my aforementioned point about AI being more than sheer amenity or a vehicle in pursuit of trivia. As I said, using, for instance, Gemini—whether the app or the Astra camera—can prove far more accessible when conducting research. It’s a sentiment Microsoft’s chief accessibility officer Jenny Lay-Flurrie has shared with me at length numerous times. It’s better for her teenage daughter, who’s neurodivergent, to use the ChatGPT-charged Bing to assist with essay research than to use Bing in the more traditional fashion. Again, not an insignificant development for AI or disabled people.
But back to this month’s Pixel Drop. Of particular import, Li reported the Astra camera feature is not tied to the Gemini Advanced subscription service. Instead, the software is tied to Pixel devices themselves. This means the Astra camera will remain available to users as a de facto accessibility feature in perpetuity, regardless of their subscriber status.
Apple TV App for Android Gains Support for Enhanced Caption Behaviors in New Update
Abner Li reported for 9to5Google today about the recently-released Apple TV app for Android getting an update to version 1.1, which brings with it a couple of newsworthy features relevant to accessibility. To wit, Apple has added support for captions to automatically appear on screen when the TV’s volume is either muted or turned all the way down, as well as when a user rewinds playback 30 seconds.
As Li writes, both features have been stalwarts on Apple’s tvOS platform for years.
How this seemingly mundane app update is pertinent to accessibility manifests in two ways. First, that captioning shows up when the device’s volume is off makes perfect sense in an accessibility context. Without captions, dialogue or music would be inaccessible. Likewise, the same effect when skipping back 30 seconds—a feature Li rightly notes is damned clever—makes dialogue accessible insofar as someone may not have picked up on an accent and/or a whisper the first time through a scene. Put together, both pieces of functionality act as de facto accessibility features: not expressly designed for disabled people, but nevertheless an accessibility aid to anyone who needs, in this case, captions for whatever reason(s).
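To make the behavior concrete, here’s a hedged sketch of the decision logic as Li describes it. This is purely illustrative, not Apple’s actual code; the function and parameter names are mine:

```python
def should_show_captions(user_enabled: bool, volume: float, just_skipped_back: bool) -> bool:
    """Hypothetical decision logic for the caption behaviors described above.

    Captions appear when the viewer has turned them on, when the volume is
    muted or turned all the way down, or temporarily after a 30-second
    skip-back (when a line of dialogue was likely missed).
    """
    if user_enabled:
        return True
    if volume == 0.0:  # muted or turned all the way down
        return True
    if just_skipped_back:  # the 30-second rewind case
        return True
    return False
```

The clever part is that the last two branches kick in even when the viewer never touched the caption setting, which is exactly what makes the feature a de facto accessibility aid.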
Lastly, here’s another primer on the (albeit subtle yet meaningful) difference between captions and subtitles. It’s a topic I’ve brought up before because, admittedly, it’s like nails on a chalkboard for me whenever I see the terms being used interchangeably. In the simplest terms, captions are for disability, whereas subtitles are for language. Captions make the spoken word accessible to those who are Deaf and/or hard-of-hearing; subtitles, on the other hand, make foreign languages more accessible to non-speakers. While it’s undoubtedly true both features do effectively the same job—flash words on the screen in an effort to maximize comprehension—the truth is they both are bespoke concepts that exist for different reasons. You may think I’m being a pesky-ass pedant, and maybe I am to some degree, but bear with me. Take CODA on Apple TV+ for instance. The entirety of the movie is subtitled under the rightful assumption not everyone who watches it speaks American Sign Language. Thus, the subtitles exist to accommodate the “disability” of not speaking ASL. As a real-life CODA myself, I’ve watched the film many times and never need the subtitles because I speak fluent ASL. I’m a rare breed, but therein lies the distinction. Captions help those who can’t hear, whereas subtitles help those who didn’t grow up using ASL as their native tongue.
Apple announced the Apple TV app (finally) arriving on Android back in February.
The HHS Layoffs Accentuate the Precarity of Disability and the Antipathy Toward Accessibility
NPR’s Joseph Shapiro reported last week about the layoffs at the Department of Health and Human Services (HHS) having collateral damage. Namely, not only are employees losing their jobs, but the federal agency is slashing labor on major programs such as Meals on Wheels. As the headline on Shapiro’s story makes clear, programs like Meals on Wheels are a lifeline for millions of senior citizens and disabled people in this country.
“At least 40% of staff got layoff notices and many were turned away at the front door Tuesday when they showed up for work at the Administration for Community Living, or ACL, which coordinates federal policy on aging and disability. That’s according to the agency’s former director under the Biden Administration, Alison Barkoff, who says she talked to multiple members of her former staff,” Shapiro wrote of the recent job cuts at HHS. “The agency funds programs that run senior centers and distribute 216 million meals a year to older and disabled people through the Meals on Wheels program.”
Similar to how oversight of special education services will (purportedly) be redirected once the Department of Education is fully dismantled, Shapiro notes Project 2025 recommends the aforementioned Administration for Community Living take up the task of overseeing programs such as Meals on Wheels. Though it goes explicitly unstated, the moves carry a strong undercurrent of President Trump and his acolytes running the government not giving a whit about Americans on the margins. The disability community in particular has forever existed at the margin’s margin; given what it took for us to secure our rights in the first place, these earliest days of Trump’s second term are as frightening—and soberingly reaffirming—as they are crazy and chaotic. The reality is programs such as Meals on Wheels are bona fide accessibility, albeit analog, because not every elderly person or disabled person can get out to the supermarket to buy groceries, then carry them home and prepare meals. My grandmother used Meals on Wheels for about a year before she died from complications of dementia in early 2007. She was unable to chew solids anymore by that point in her life, so her meals came to the house helpfully puréed such that all my uncle and I needed to do was feed her. It was accessibility in more ways than one: first and foremost, it obviously afforded her basic daily sustenance, but it also alleviated some of the caregiving load from us.
Call it fatalistic thinking, but the undoing of things like the Department of Education or, in this case, the Meals on Wheels program, provides the slipperiest of slopes to more dramatic and sinister action later. Even dating back to Trump’s first term, I’ve believed he and his cronies would take steps to repeal the Americans with Disabilities Act and thus institutionalize us were it possible and politically expedient. The White House clearly believes in flouting the law, so it wouldn’t surprise me one iota if they tried to weasel out a rationale for pulling back crucial protections for people like me—those who ostensibly provide the least value to the economy and to society. In other words, today’s Republicans aren’t the George H.W. Bush variety of conservatives by a long shot.
Nintendo’s Switch 2 Is Giving This Casual Gamer Serious Cause to Assess Accessibility
Consider this post complementary to the previous one published earlier today.
Earlier this week, Nintendo announced the hotly-anticipated Switch 2. The hybrid console, the progeny of the original model from 2017, costs $449 and is expected to ship on June 5. The Verge’s Ash Parrish posted a great rundown of the device’s details, particularly its nerdy and nitty-gritty technical specifications. To wit, Nintendo says the Switch 2 has a 7.9-inch LCD screen running at 1080p resolution and supports HDR10 and a 120Hz variable refresh rate. Moreover, the Joy-Con controllers have mouse-like functionality, and the console features 256GB of storage and a dock capable of outputting 4K video.
As someone who’s admittedly a decidedly casual gamer—albeit with three consoles: a PS5, Xbox Series S, and Mega Sg—I’ve long been curious about the Switch given its modularity and immense popularity amongst gamers and tech nerds. An OLED Switch has sat in my Amazon cart since time immemorial in part because (a) casual gamer; and (b) I wonder about the console’s accessibility. To that point, I posted on Mastodon a few days ago earnestly asking why Nintendo chose to ship the ostensibly newer and technically superior Switch 2 with a standard LCD display. Conventional wisdom dictates Nintendo would opt for OLED in the Switch 2, particularly given the company literally made a bespoke version of the original with that very display technology. But Nintendo clearly bucked convention, as IGN’s Wesley Yin-Poole reported this week the company decided to use LCD “after a lot of consideration.” Yin-Poole’s story includes comment from Tetsuya Sasaki, who serves as Nintendo’s GM of its technology development division and as senior director of its technology development department, who said in part “we took a look at the technology that was available to us now” and there’s been “a lot of advancements that have been made in LCD technology.” Sasaki’s rationale notwithstanding, it surprised and disappointed me to learn the Switch 2 omits OLED—a choice all the more bewildering considering my thoughts regarding fellow Japanese giant Sony’s refreshed TV lineup for 2025, also announced to much fanfare this week.
And rubbing even more salt into the proverbial wound is this post on X today from my sister from another mister, Kathryn Watson of CBS News, wherein she links to a report that the April 9 date for pre-orders is being pushed back due to what Nintendo called “the potential impact of tariffs and evolving market conditions.” The company promises to give a further update on a new pre-order date “soon,” according to CBS News. This is even more confirmation that President Trump’s terrible tariffs make zero sense whatsoever.
The Verge’s Sheena Vasani and Brandon Widder wrote a good explainer on pre-orders.
Anyway, back to accessibility. Nintendo’s big reveal of the Switch 2 this week has me oscillating over whether I should try the new one or, in another nod to my casualness, opt for the original OLED edition. Whichever system I end up with, it’ll be a monumental decision insofar as it’s been over three decades since I last played on a Nintendo system. Considering that timeframe, I’m keenly interested to see just how far accessibility in video games has progressed in 30 years. And given Nintendo is involved in the Entertainment Software Association’s new Accessible Games Initiative, it’s highly logical to presume the Switch 2 builds on the accessibility attributes of its predecessor.
Inside the ESA’s New ‘Common Language’ for Conveying Accessibility in Video Games
Late last month, the Entertainment Software Association (ESA) published a press release in which it announced the formation of the Accessible Games Initiative. The Washington, DC-based organization touted that the so-called AGI was created in a concerted effort to “provide players with information on accessibility features in video games,” adding the Initiative is a “cross-industry effort.” The AGI’s members are a veritable who’s who of gaming heavyweights, including Electronic Arts, Nintendo of America, and Ubisoft, as well as captains of the tech industry in Google and Microsoft.
Almost 191 million Americans “play video games regularly,” according to the ESA.
“The primary purpose of the new Accessible Games Initiative is to help provide clear information about whether a specific video game has accessibility features and, if so, what those features are,” the ESA wrote in its announcement. “The initiative is launching with a set of 24 ‘tags,’ all with clear criteria about what each tag means. Sample tags include: clear text, large and clear subtitles, narrated menus, stick inversion and save anytime, among others. Over time, participating companies will place the Accessible Games Initiative tags near their game product information—for example on digital storefronts, product pages, or digital catalogues. This can help provide players and other consumers, such as parents and teachers, with clear information about which features are in any given game before they make a purchase.”
In his own statement for the press release, ESA president and chief executive officer Stanley Pierre-Louis said in part the organization is “proud” to be shepherding the AGI while also saying it “demonstrates how impactful we can be when we work together in our industry-wide pursuit of helping more people experience the power of play.”
In an interview with me conducted via email earlier this week, ESA senior vice president Aubrey Quinn reiterated the language in the press release regarding the organization’s mission to “improve players’ gameplay experiences by providing consumers with clear information about the accessibility features available in specific video games” and added the use of the new tags is strictly voluntary. When asked what inspired the creation of the AGI, Quinn explained that although tens of millions of Americans cope with some sort of disability, there heretofore hasn’t been a “common language” which communicates the accessibleness of the games people play. Different platforms, she told me, have different interpretations of what, say, Large Text, really means in practice.
“Before the Accessible Games Initiative, the burden was on players to interpret existing tags in the marketplace from platform to platform, game to game. We hope to change that,” Quinn said of the driving force behind the AGI’s formation. “This new initiative is meant to help consumers identify specific accessibility features in individual video games, so that players buying games can make better informed purchasing decisions. Having a clear approach to identify accessibility features across different games, with criteria behind each accessibility tag, will provide consumers with information about the accessibility features they can find in games prior to purchasing them.”
In looking at the video game industry from a bird’s eye view vis-a-vis accessibility, Quinn told me the AGI is cognizant of the notion that “no one feature, initiative, or accessory is going to solve every accessibility challenge for every person.” Moreover, she said the AGI exists as but “one piece of the puzzle,” noting the industry writ large has made “significant advancements over the years” towards the betterment of more accessible and inclusive gaming. As with all things disability inclusion, Quinn and team are aware accessibility is an evergreen endeavor and thus “there is always more progress to make.” Indeed, accessibility is never-ending and a forever function.
“We know that companies across the industry are working day in and day out to improve the gameplay experience for players with disabilities,” Quinn said.
Quinn keenly emphasized the founding bodies of the newfound AGI thought it imperative that no one entity actually “owned” the accessibility tags. By contrast, the AGI members staunchly believe the tags will prove successful only if they’re embraced by the industry as a collective as opposed to an individual company. According to Quinn, it’s the ESA’s hope the tags will “encourage companies and developers to not only more frequently and easily identify the accessibility features in their games, but also look at the developer criteria and consider accessibility features they can build into their games from the start.” For transparency’s sake, it ought to be mentioned the AGI is overseen by the ESA. The organization described itself in its announcement as “the trade association representing the [United States] video game industry.”
Feedback-wise, Quinn said the early reception to the AGI has been “overwhelmingly positive” and noted the group is looking forward to seeing the tags begin being implemented. As critique comes in, and as new technologies emerge, Quinn said the AGI will respond accordingly by adapting, evolving, and adding new tags as necessary.
As to the future, Quinn said “everyone is invited to join us” in the AGI’s mission to make video games more accessible to all. She reemphasized her earlier point that, however elective, it’s the AGI’s hope the tags will be “widely implemented across the industry” while stressing a company needn’t be a codified AGI member in order to use them.
“The more companies, developers and storefronts use them, the more players will benefit,” Quinn said. “We also hope that players, developers and advocates alike provide us with feedback on the tags and how they are working for them.”
The full list of tags (plus more information) can be found on the AGI’s website.
Sony’s 2025 TV Lineup Reminds Us Image Accuracy Shouldn’t Trump Accessibility
Sony this week unveiled its refreshed line of Bravia televisions and soundbars for 2025. The Verge’s Sam Byford posted a good rundown of the new products. Of particular import, Sony announced the successor to its “King of TVs” in the widely acclaimed A95L OLED, on the market since 2023, with the Bravia 8 Mark II. The new model features a QD-OLED panel like the A95L and comes in 55- and 65-inch sizes, according to Byford’s report. He includes comment from Yosuke Nakano, who works on Sony’s product planning team, who proclaimed the forthcoming Bravia 8 Mark II features the “most faithful reproduction of the professional monitor look of any TV in the industry.”
No word yet on pricing, but Byford notes release will come “sometime this spring.”
I watched Sony’s introductory video for its new home theater products on YouTube last night. It’s a good watch, and struck me as extremely Apple-like in its production value and overall presentation. Like Apple, Sony seems extremely proud of its engineering prowess involving its custom silicon and panel technology for its TVs. And also like Apple, Sony’s TVs—particularly premium models like the A95L and Bravia 9—command a pretty penny, price-wise. There are ways in which Sony is technically superior to the competition, but what the company’s fans pay top dollar for is the user experience.
As an avowed home theater nerd, I have to admit I’ve never used a Sony television or soundbar in my life. In the same way Apple’s Pro Display XDR is my white whale desk monitor, the A95L (or 85” mini-LED Bravia 9) is my white whale TV. The Bravia 9 is especially fascinating because it gets damn close to OLED-level picture quality with great contrast and inky blacks all the while being bright enough to burn Icarus and in a larger-than-life size. The nerdy and disabled parts of me really want to try this stuff.
In the last few months, I’ve had two OLED TVs sent my way: the 48-inch LG B4 and the 77-inch LG C3. Both replaced TCL models in my office and in our living room, respectively. The larger C3 is wall-mounted and supplants the mini-LED 2020 6-Series, which was itself widely praised. I watched the Chiefs-Eagles Super Bowl rematch on the new C3 and it was a real treat for my eyes. It was so good, in fact, that it ultimately sold me on this idea: OLED is to TVs what the Retina display was to the iPhone 4. What Steve Jobs said about the latter in 2010 applies to OLED: “Once you go Retina,” he said, “you can’t go back.” It’s so true; while I’ve used OLED on my iPhone and Apple Watch for years, not to mention the M4 iPad Pro I got for my birthday last September, having OLED on such large screens is an entirely different experience. I don’t think I could go back to ostensibly “lesser” display technology. With the exception of brightness—more on that in a minute—the picture quality of an OLED television is unparalleled, in my opinion. A major reason for this, of course, is the aforementioned contrast and perfect blacks thanks to pixel-level control. That’s beyond even the beastly Bravia 9’s ken.
Which brings me back to Sony’s new lineup of TVs. Part of the reverence for Sony’s TVs amongst reviewers and A/V enthusiasts is that reference standard—Sony’s a master at achieving cinematic looks with a heavy emphasis on accuracy and preserving creative intent. If you like filmmaking as art, then Sony’s TVs are right up your alley. I can empathize with those sentiments because I do philosophically believe TV and filmmaking are art forms. As a practical matter, however, therein lies the tension. Reviewers have an incessant proclivity to focus on accuracy when evaluating TVs, whether Sony’s, LG’s, or anyone else’s; the problem is accuracy gets prioritized at the expense of accessibility. As someone with extremely low vision, the more “accurate” picture modes and settings are way too dim and dull for my needs. Likewise, it’s a popular refrain amongst reviewers and Reddit-goers to lambast Vivid picture mode because everything is juiced up to the max and thus wildly inaccurate. What this criticism fails to consider is that choosing Vivid mode isn’t merely a matter of preference; it’s a matter of necessity. The reality is, I need the image on my TVs juiced up to the max in order to get any semblance of joy from them. From an intellectual standpoint, I acknowledge the downsides in terms of accuracy and fidelity—but accuracy doesn’t do shit for me if the imagery is inaccessible in the first place. And, frankly, I have neither the time nor patience to calibrate my TV or hire a calibrator. The purists of the world may recoil in horror. The fact is, for me, accessibility matters way more than accuracy does.
In a similar vein, accessibility is why I set my iPhone, iPad, and Apple Watch to max brightness and turn off auto-brightness. Like accuracy in TVs, the casualty on my mobile devices is battery life. But… it’s a tradeoff I must make in the name of usability.
Granted, nobody’s going to take a battering ram to my front door and arrest me for assault against television kind. The salient point is, as with everything else within tech journalism, accessibility is a sorely undervalued and unappreciated use case. Accuracy is all empirically well and good, but there’s little context behind it. By contrast, to value accessibility is to know what truly matters is disabled people feeling included in watching content as opposed to feeling excluded because the subject matter experts clutch their pearls over accuracy. What I’m saying is, accuracy shouldn’t be the end-all, be-all in modern television reviews. There’s more to a TV’s story than sheer accuracy.
The moral? Once again, advocacy of the disability community is downright Sisyphean.
How the NBA’s Sacramento Kings Make Home Games More Accessible to Every Fan
Back in November, I posted a story to my dearly departed Forbes column in which I posited Apple Vision Pro has the potential to make live sporting events far more accessible for Blind and low vision people. The impetus for my piece was news from 9to5Mac’s Ryan Christoffel that Apple was in talks with Spanish soccer club Real Madrid to trick out the team’s Santiago Bernabéu stadium with cameras capable of streaming Apple Immersive Video for visionOS. As Christoffel explained, the inspiration for the collaboration was literal accessibility: Real Madrid’s home was at capacity with its regular sellouts and thus couldn’t accommodate more fans at matches. The idea, then, was the deep immersion of Vision Pro could enable fans who couldn’t physically be at the venue to virtually feel like they are there.
My two loves—sports and technology—converge. You see, as a lifelong sports fan who happens to be a lifelong disabled person, attending sporting events in person has always been a bittersweet experience. On one hand, the ambient experience of being at the ballpark or arena or stadium is super fun because… sports! On the other hand, however, people ultimately shell out big bucks for tickets to, you know, actually watch the game(s). As someone with extremely low vision, the problem with this dichotomy is obvious—which side of me do I indulge? More often than not, others fill in the gaps for me; in March of last year, when I was in Las Vegas to watch the Pac-12 women’s basketball tournament, my closest cousin Jenn and her wife effectively had to text me play-by-play so I knew who was on the floor and what was going on. They also came to my (accessible) seat during halftime to check on me and ensure I was enjoying the festivities. Overall, it was a fun time; I did it again a couple months later for a Bay FC home game down at San Jose’s PayPal Park—which, coincidentally, featured Nigeria-born Asisat Oshoala. The problem is, I don’t want the people I love feeling burdened by keeping me in the loop at live events.
Enter OneCourt and its tactile tablet device.
On its website, the company boasts it’s enabling “the first tactile sports broadcast [by] transforming gameplay into trackable vibrations so fans with visual disabilities can watch the game with their fingertips.” According to OneCourt, 250 million people the world over “can experience sports intuitively and independently.” OneCourt notes it has received grants and awards from Microsoft’s AI for Accessibility Program, the University of Washington, the National Science Foundation, and more. Additionally, OneCourt says it’s working “closely” with T-Mobile’s Tech Experience 5G Hub in an effort to “deliver low-latency experiences in stadiums” for people with disabilities.
OneCourt’s tablet display features haptic feedback that allows Blind and low vision people to track the game(s) in real-time by way of audio and tactile vibrations. The software translates live gameplay into spatial feedback, giving a Blind or low vision person the ability to “watch” the game with only their fingertips. OneCourt worked closely with the NBA to ensure the immediacy of the experience, as the tablet is unhampered by the latency encountered by conventional radio broadcasts. OneCourt’s tech ensures an accurate, real-time depiction of the action happening on the floor.
Speaking of the NBA, the league’s Sacramento Kings—alongside their neighbors 579 miles north in the Portland Trail Blazers—have partnered with OneCourt and Ticketmaster to provide the former’s device free of charge to Blind and low vision fans who visit Golden 1 Center for a Kings’ home game. The partnership speaks to the organization’s ethos around inclusivity and enhancing the game experience for fans.
Put another way, the Kings’ rallying cry of “Light the beam!” is meant for every fan.
“The Sacramento Kings are committed to making live sports more inclusive, and our partnership with OneCourt and Ticketmaster is a major step forward,” the team said in a brief statement provided to me over email earlier this week. “This groundbreaking haptic technology allows Blind and Low Vision fans to experience the energy of the game in real-time through spatial vibrations and audio feedback. By providing these devices at Golden 1 Center, we’re reinforcing our dedication to accessibility and ensuring that all fans can engage with the excitement on the court.”
Yet despite the significance of the OneCourt device, it isn’t the Kings’ one and only nod to accessibility. Like Pittsburgh International Airport, the organization has outfitted its arena with California’s first-ever sensory room in a sports setting. The room is festooned with texturized walls, soothing ambient sounds, specialized furniture, and more, designed for autistic and other neurodivergent people. What’s more, the Kings’ sensory room is open to anyone who needs a break—catering to those with dementia, PTSD, and, like me, high anxiety. Additionally, the Kings offer so-called sensory kits consisting of noise-cancelling headphones, verbal cue cards, and more. Game-goers are able to check out the kits at Golden 1’s Guest Services Station, located at Sections 109 and 207. The room was developed in collaboration with KultureCity.
Netflix Announces ‘More Multilingual’ Experience to Improve Accessibility on TVs
It may not seem so in the classical sense, but multilingualism is accessibility too.
It’s the reason subtitles exist. And no, subtitles and captions aren’t the same thing.
In the spirit of making Netflix more accessible, the company on Wednesday announced subscribers now have the ability to choose from the “full list” of more than 30 languages when watching content in its TV apps. Netflix says the functionality is “much-anticipated,” noting it receives “thousands” of localization requests a month from customers. The more robust language support on TVs means Netflix has reached feature parity, as the options have been available on mobile and the Netflix website. One of its most popular shows of all-time, Netflix said, is the Korean drama Squid Game.
“As more of our members enjoy shows and films from around the world—nearly a third of all viewing on Netflix is for non-English stories—subtitles and dubbing are more important than ever,” Netflix said of its language expansion in today’s announcement. “Our catalog includes a wide variety of TV shows and movies in over 30 languages, from critically acclaimed films to binge-worthy series, which lets our members immerse themselves in different cultures and languages from around the globe.”
As enormous and entrenched in the cultural zeitgeist as Netflix is now, relatively few may realize the company cares a lot about accessibility. I’ve covered Netflix’s action in this arena at close range over the last few years; my reporting has included interviews with its director of product accessibility, Heather Dowdy. She’s a trailblazer insofar as she’s the company’s first-ever executive in such a role. Dowdy is, like yours truly, a CODA.
Apple’s Accessibility Legacy Profiled on the Latest ‘Twenty Thousand Hertz’ Podcast
Here’s a tidbit of (admittedly esoteric) tech trivia: Apple has worked on accessibility for 40 years. In fact, the voice the original Macintosh used to introduce itself in 1984 was powered by the Mac’s very first text-to-speech engine, cheekily known as MacinTalk.
An accessibility feature ushered in one of the most seminal devices in computing history.
I was not even 3 years old when the Macintosh debuted. I was 4 when Apple’s accessibility department was born in 1985. How do I know these events if I was too young to remember them? It isn’t thanks to Wikipedia—it’s the Twenty Thousand Hertz podcast! The show, hosted by Dallas Taylor, is described as “a lovingly crafted podcast about the sonic world for all ages [that’s an] upbeat, inclusive celebration of sound, covering everything from pop culture, to science, to history and beyond.” The latest episode is all about Apple and accessibility, particularly the innovations the company has created around audio and sound. A newsworthy mention on the podcast is the hearing aid feature in AirPods Pro that, alongside Apple Intelligence, was delivered to customers with last October’s much-ballyhooed release of iOS 18.1. Taylor’s narration is interspersed with interviews with Apple’s Sarah Herrlinger, Deidre Caldbeck, Ron Huang, and Eric Treski. CEO Tim Cook can also be heard in the episode.
I interviewed Herrlinger, Apple’s senior director of global accessibility policy and initiatives, back in December. She talked to me all about the aforementioned AirPods hearing health features, fittingly noting Apple has worked on hearing-oriented accessibility software “for decades.” She added accessibility, be it hearing or something else, is deeply ingrained in “everything we do” at the company because disability is part and parcel of the human experience. To that end, she said, the company wants its products to be accessible to, and usable by, everyone. This ethos goes a long way in explaining why Apple’s accessibility features are renowned in the industry for their unparalleled breadth and depth. It’s this purposeful acknowledgment of, and commitment to, such a diverse set of needs that makes Apple Secretariat at the Belmont in the eyes of many in the disability community, as well as other accessibility aesthetes.
Beyond Apple, it’s worth noting the episode begins with a cursory look at how accessibility features have influenced history. Audiobooks, for example, are mainstream nowadays and immensely popular. Yet audiobooks were originally conceived to meet the needs of Blind people who couldn’t read the printed word. The anecdote reminds me of my interview last year with disability activist Dr. Victor Pineda. He told me in part disabled people are technologists and accessibility is an incubator for innovation. His website makes that notion even more abundantly clear, saying in a really big font “inclusion is not just a buzzword; it is the very essence of innovation.”
The Twenty Thousand Hertz episode with Herrlinger can be found on Apple Podcasts.
DOGE’s Plan to Rewrite Social Security’s Codebase Is Yet Another Reminder Disability Advocacy Will Forever Be a Sisyphean Task
My good friend Makena Kelly reported last week for Wired about Elon Musk’s Department of Government Efficiency, or DOGE, and its plans to “rebuild” the codebase used by the Social Security Administration. The current system used by the agency, which comprises tens of millions of lines of code written in the decades-old programming language COBOL, is being targeted by DOGE to be rewritten in mere months. Kelly notes safely rewriting the existing codebase would be a years-long endeavor.
The project seeks to adopt a “more modern” language such as Java, according to Kelly.
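To make the risk concrete: one classic trap in porting COBOL to a language like Java is numeric representation. COBOL’s fixed-point decimal fields keep dollars and cents exact, whereas a naive rewrite using binary floating point quietly drifts by fractions of a cent—exactly the kind of “invisible errors” the agency technologist warns about below. Here’s a minimal sketch of that pitfall, written in Python for brevity (the dollar figures are made up for illustration; nothing here reflects SSA’s actual code):

```python
from decimal import Decimal, ROUND_HALF_UP

# Binary floats can't represent most decimal cents exactly, so naive
# float arithmetic drifts when summing payments.
float_total = sum([0.10] * 3)            # not exactly 0.30 in IEEE-754

# Decimal mirrors COBOL's fixed-point PIC fields: cents stay exact.
dec_total = sum([Decimal("0.10")] * 3, Decimal("0.00"))

# A hypothetical cost-of-living adjustment, rounded to whole cents
# half-up, the way benefit math must be pinned down explicitly.
benefit = (Decimal("1234.56") * Decimal("1.025")).quantize(
    Decimal("0.01"), rounding=ROUND_HALF_UP
)

print(float_total == 0.30)   # False — the float sum has drifted
print(dec_total)             # 0.30
print(benefit)               # 1265.42
```

A rewrite team has to get thousands of details like this right—rounding modes, field widths, overflow behavior—for every calculation, which is a big part of why rushing the job is so dangerous.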
The reason I’m covering my friend’s report is simple: the technological tie is obvious, but more pressingly, the collateral damage to overhauling Social Security’s codebase is the disability community. This is absolutely not theoretical; I’m a lifelong disabled person who currently receives Social Security benefits. I am, as Kelly writes, one of the 65 million Americans served by Social Security’s programs. DOGE’s determination to ostensibly “better” Social Security’s operation with new code has massive potential to hurt disabled people by paying us incorrectly, if at all. Whatever the case, screwing with one’s monthly payments would have disastrous consequences for people like myself who rely upon what we’re entitled to for everyday sustenance.
Social Security workers realize the precariousness of DOGE’s proposition.
“Of course, one of the big risks is not underpayment or overpayment per se; [it’s also] not paying someone at all and not knowing about it. The invisible errors and omissions,” a technologist at the agency said to Kelly in an interview.
Call it cynical or fatalistic, but I’d be willing to bet my monthly Social Security check Musk and Steve Davis, the Musk lieutenant charged with overseeing this undertaking, hadn’t anticipated or considered the ripple effects of such an enormous change. To put it more crassly, both men clearly didn’t give a shit about what could happen to those in the disability community if Social Security began to hiccup as a result of their reckless ambition. They not only have zero prescience, they’re heartless. More crucially, Musk and DOGE have no clue how government works, or why it works, and they won’t learn.
They’re too stupid, and too brazen, to understand, as Kelly wrote, even minor changes to Social Security’s system can “result in cascading failures across programs.”
Kelly’s story dovetails well with my piece from over the weekend about the newest American Experience documentary on the history of the Americans with Disabilities Act (ADA). In the film, disability activist Anita Cameron said “there’s a cost to our existence.” She meant it in context of the tangible accommodations the ADA mandated be built for disabled people, but those words are apt in context of Social Security too. For the vast majority of disabled people, not to mention senior citizens and children who are disabled, Social Security benefits are the literal cost of keeping disabled Americans alive. The money pays bills. It pays for food. It pays for caretakers. It pays for electricity and internet access. It pays for medication. In so many ways, Social Security benefits are the engine of the disability community; arbitrarily ripping out the agency’s spine that is its codebase is akin to throwing the baby out with the bathwater.
Kelly notes Social Security has pondered modernization before—just not like this.
Reading Kelly’s report served as yet one more sobering reminder that, despite the ADA’s unquestionable standing as landmark legislation on par with the Civil Rights Act, there is still so much further to go societally when it comes to the embrace of disabled people. Granted, the ADA and Social Security are separate things, but the tie that binds them is the people they serve. To wit, disabled people still remain on the margin’s margin when it comes to advocacy and representation. All the staunch DEI supporters embrace Pride Month in June for the LGBTQ community, yet there’s a pittance of solidarity in July for Disability Pride Month. News organizations can hire reporters to cover narrow beats like AI, but are unwilling to cover accessibility because it’s misunderstood and won’t drive enough web traffic. TV shows like See on Apple TV+ are panned for being bad television, yet no one acknowledges the representational gains of Blind people being (leading) badasses. It’s why accessibility in tech is, and always will be, an evergreen cause—and really good journalism fodder for me to boot.
“Move fast and break things” is a nice mantra for startups. It isn’t nice for governmental entities like Social Security. Breaking things can very materially break people’s lives.
Smoke Detectors Are Unsexy, but the Nest Protect Has the Best Accessibility Feature
In a report filed late last week, Jennifer Pattison Tuohy at The Verge bore bad news for longtime Nest aficionados like me: Google is discontinuing the Nest Protect smoke alarm, released in 2013. Tuohy points to a post on Google’s Nest Community Forum in which the company says in part it has ceased manufacturing the Nest Protect while noting it’s available for purchase from Google’s online store and third-party retailers “while supplies last.” Google attempted to allay fears of looming non-functionality by reassuring current Nest Protect users the devices “will continue to receive security updates and continue to work as they always have through their expiration dates.”
Google is replacing the Nest Protect with the First Alert Smart Smoke & CO Alarm.
Although Tuohy characterizes the move as “this week’s least surprising news,” it certainly surprised me—but maybe it shouldn’t have as someone who’s clinging to the OG Nest app on my iPhone and iPad instead of (begrudgingly) embracing the Google Home app. I have a slew of older-but-still-perfectly-good Nest devices in my home, including the Nest Thermostat E, Nest Hello doorbell, the also-discontinued Nest × Yale door lock, and a couple Nest IQ outdoor cameras. None of them supports Matter or Thread, so they’re hooked into HomeKit through the delightfully simple and straightforward Starling Home Hub. I acknowledge there inevitably will come a day when all of these devices stop receiving software support, which will compel me to upgrade, but I will ride with them as long as possible. As I said, everything still works with aplomb and, frankly, both my partner and I love the Nest ease of use and design.
All of our Nest devices make our home more accessible to me—even the Nest Protect. However unsexy a smoke detector may be, what makes the Protect so endearing in use is its so-called Pathlight feature. Google describes the functionality as “[lighting] your way so you don’t need to stumble around in the dark. It’s ideal for any location in your home where you’d like a bit of extra light,” adding “it’s like a nightlight that turns on when you walk by your Google Nest Protect, and turns off after you’ve passed.” Pathlight, the company says, uses high-efficiency LED lights. Pathlight is a de-facto accessibility feature for me at night when I get out of bed and stumble down the hall to use the bathroom. There’s a setting in the Nest app (and Google Home) where users can adjust the intensity of the light, but I’ve left ours at the default Medium brightness. Pathlight makes it so much more accessible—and safer!—to get to and from the bathroom, as my vision in darkness and low-light is terrible. It’s one of those features ostensibly built for convenience that, in actuality, is equally beneficial for accessibility’s sake. It’s convenient that there’s a sensor in the Nest Protect up on the ceiling that automatically lights my way, but it’s arguably more important the light makes it more accessible to get up and down the hallway safely. Accessibility trumps convenience.
You Should Get the PBS App This Weekend
I have a couple loosely-related content recommendations for you, dear reader.
First, before bed last night, I watched the latest American Experience documentary about the history of the Americans with Disabilities Act (ADA). The hour-long film, called Change Not Charity: The Americans with Disabilities Act, offers a detailed look into disability inclusion in this country and the disability community’s (ongoing) fight for recognition and civil rights. The film includes a bunch of good interviews, including with renowned disability activist Judy Heumann and retired senator Tom Harkin (D-IA), who authored the ADA and was its chief sponsor. He delivered parts of his speech introducing the bill on the Senate floor in ASL so that his Deaf brother could understand him. As for Heumann, she passed away in March 2023.
Watching the documentary obviously resonated with me; it was a poignant reminder of how very much spiritually and actively aligned I am with the missions of people like Heumann, Harkin, and more. Disability rights sits at the core of what I’ve done for the past dozen years as a technology reporter. Back in 2013, I carved out a beat, virtually all by my lonesome, covering accessibility and assistive technologies. I knew it was risky because the subject was—and remains—abstract and esoteric and misunderstood by the majority of folks in the press. Believe me, disability decidedly isn’t a hot topic at tech desks in newsrooms; despite the proliferation of equally narrow beats like AI, social media, and media, accessibility sits at the narrowest of margins. With precious few exceptions, like my good friend at The Washington Post in Amanda Morris, disability simply isn’t a priority like race and sexuality when it comes to social justice reporting. I’ve lamented this before, and hawked my own wares on LinkedIn, both because I love working in journalism and, more pointedly, because I’m utterly devoted to the representational cause. It’s precisely why, with a reporting résumé that includes interviewing Apple CEO Tim Cook, one of the feathers in the proverbial cap of my career I’m most proud of is interviewing Tony Coelho in 2020. Coelho, the retired Democratic congressman from California, copes with epilepsy (as my mother did) and is perhaps best known as being the ADA’s primary sponsor alongside the aforementioned Harkin.
It’s hard to believe the ADA turns only 35 years old come late July. Harder still, the ADA was signed into law by a Republican president in George H.W. Bush. I cannot fathom the bill even passing muster in Congress under today’s ghoulish Republican Party, let alone earning Trump’s signature. As with other marginalized communities, it isn’t a stretch to surmise today’s Republicans give not even the slightest of shits about people like me. I’ve no doubt they would institutionalize us posthaste were it possible and politically expedient to do so.
Which brings me to my second recommendation. Capitol Hill has been in a tizzy this past week over the “Signalgate” scandal involving sensitive military operations and the accidental addition of Atlantic editor-in-chief Jeffrey Goldberg to the group text chat. Goldberg is host and moderator of one of my favorite nerdy shows in Washington Week which, like American Experience, airs on PBS. I watch it faithfully every Friday evening, usually in the living room over dinner. This week’s episode promised to be must-see TV given (a) the show covers the hottest political stories of the week; and (b) Goldberg himself is a central figure in this one. I thought it would be riveting television, and it turned out I was exactly right. To me, it was just as enthralling and entertaining as the Season 2 finale of Severance which, coincidentally, dropped just last Friday on Apple TV+.
Anyway, if you want good things to watch this weekend, the PBS app is your friend.
What the Lumon Terminal Pro Says About Accessible Computers, Past and Present
In a break from the norm, Apple’s latest Mac doesn’t look to be very accessible.
Earlier this week, the company launched the Lumon Terminal Pro. The Terminal Pro is obviously a fictitious product and thus not actually for sale; it’s the machine Mark S., Helly R., Dylan G.—and presumably Tim C.—use to do their “mysterious and important” work as part of the Macrodata Refinement, or MDR, group on Severance, the cultural phenomenon streaming on Apple TV+. On the Terminal Pro’s page, Apple links to a spoiler-filled video detailing how Severance is edited on the iMac, MacBook Pro, and Mac mini. It’s a good 12-minute watch, if somewhat meta insofar as, while watching it last night, I chuckled at the thought that someone at Apple had to edit the video about editing video.
The other thought I had while watching the making-of video was how starkly the Terminal Pro contrasts with the aforementioned, decidedly more modern Macs. To wit, seeing the Terminal Pro on the show evokes strong memories within me of my years in elementary school using the Apple IIe during computer lab. I played a helluva lot of Oregon Trail and Odell Lake on those things—even when I was supposed to be doing something else decidedly more pedagogical in nature. As a kid with low vision, I found the Apple IIe’s screen hard to see, despite the high contrast of its green text against a black background. The full complexity of accessibility as a concept was intellectually lost on me at that age, although I suppose I had an innate sense of it given how I struggled to see the screen (and the glyphs on the keyboard). Were the Terminal Pro a real shipping product, my spidey sense strongly suggests its accessibility would be on par with my Apple IIe experiences of yesteryear—which is to say, not very good.
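As an aside, my childhood intuition about that green-on-black display tracks with how contrast gets quantified today. By the WCAG relative-luminance formula—a modern standard the Apple IIe long predates—pure green text on a black background scores roughly 15:1, well above the 7:1 AAA threshold. Which underscores the point: raw contrast alone never made those screens accessible to me. A quick sketch in Python:

```python
def srgb_to_linear(c: int) -> float:
    """Convert an 8-bit sRGB channel to linear light (per WCAG 2.x)."""
    c = c / 255
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(r: int, g: int, b: int) -> float:
    """WCAG relative luminance of an sRGB color."""
    return (0.2126 * srgb_to_linear(r)
            + 0.7152 * srgb_to_linear(g)
            + 0.0722 * srgb_to_linear(b))

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    """WCAG contrast ratio between two colors, always >= 1."""
    l1, l2 = relative_luminance(*fg), relative_luminance(*bg)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

# Pure green phosphor-style text on a black background
print(round(contrast_ratio((0, 255, 0), (0, 0, 0)), 1))  # → 15.3
```

High contrast is necessary but not sufficient, in other words; glyph size, glare, and everything else still matter.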
By the same token, however, it’s remarkable how the technological advances of the last few decades have enabled me to not only enjoy computers in a nerdy sense, but to thrive with them in a productivity sense. Come May, I will have been an independent tech journalist for a dozen years, all of which have been spent from home and on the various Apple computers which have festooned the newsroom known as my desk in the little corner of my dining room. (Yes, I’m very privileged to have a house with a bespoke dining room.) None of my reporting would have been possible without, amongst other technical marvels, an accessible computer. My current machine is likely careening towards the end of its life come June, but it has treated me so well since July 2019. I’d love to know how many millions of words I’ve banged out on this thing over the years. It’s a testament to not merely how accessible the Mac is to me personally, but to Apple’s commitment to software support beyond older machines’ ostensible modernity. Sure, Apple Silicon is the shit, but my iMac is still steamrolling along on Intel and suits my workflow just fine. The 4K panel, in particular, remains glorious to look at every single day. What I’m saying is, the value proposition has been astronomical.
Beyond computers, it’s ironic in a way that, for all the praise I heap upon Apple for prioritizing disability inclusion in the company’s entertainment arm, Severance is one of my all-time favorite television shows despite having nothing whatsoever to do with disability. I just love it so much. I was admittedly late to jumping on the proverbial bandwagon, but I’m firmly riding it now. Although I subscribe to umpteen streaming services, Apple TV+ is unquestionably my favorite not only because of Severance and For All Mankind and The Morning Show and Dickinson and Ted Lasso, but because it’s also home to what I believe is the most compelling catalog of earnest, genuine disability representation found in Hollywood today. As a lifelong marginalized and underrepresented person, I feel as though I can truly have my cake and eat it too.
The Terminal Pro’s appearance on Apple’s Mac website is the latest marketing ploy for Severance. Back in mid-January, cast members appeared at a promotional event held at Apple’s Grand Central Terminal retail outpost ahead of the show’s Season 2 premiere. Parker Ortolani wrote about the event for his blog, replete with splendid photography.
I can’t wait for Season 3 of Severance, praise Kier. The first two are on Apple TV+ now.
Waymo Touts It’s ‘Laying the Groundwork’ for Coming Washington DC Expansion
Waymo announced on Tuesday it’ll be “ready for riders” in Washington DC come 2026.
The Alphabet-owned autonomous vehicle company, which boasts it’s “the world’s leading fully autonomous ride-hailing service” and provides over 200,000 fully autonomous paid trips each week, said in its announcement it plans to “continue introducing ourselves to DC’s communities and emergency responders over the coming months,” as well as working closely with policymakers in an effort to “formalize the regulations needed to operate without a human behind the wheel in the District.”
“Waymo One is making fully autonomous driving a reality for millions of people across the [United States],” Tekedra Mawakana, Waymo’s co-CEO, said in a statement for the company’s announcement. “We’re excited to bring the comfort, consistency, and safety of Waymo One to Washingtonians, those who work and play in the city every day, and the millions of people from around the world who travel to the District every year.”
In reacting to Waymo’s expansion news, the American Council of the Blind (ACB) today posted on LinkedIn it is “excited” by the new development and is “proud” to continue its work with Waymo to “[push] toward a future of greater independence and freedom of mobility by making this technology available to the thousands of people in DC who are Blind or have low vision.” The ACB included in its post a photo of executive director Scott Thornhill and director of advocacy and governmental affairs Claire Stanley standing in front of a Waymo SUV. With Stanley is her guide dog named Tulane.
I’ve covered Waymo both extensively and from extremely close range over the last few years. They have service here in San Francisco, and I’ve been a Waymo One user since before the app went public. The best recommendation I can give Waymo is to say its app has cemented a place on the Home Screen of my iPhone 16 Pro Max. Without even the slightest hyperbole, Waymo has utterly revolutionized how I get around my city. While I continue to walk around my neighborhood for quick jaunts, anything further away and I’m using my phone to summon a Waymo. Likewise, while I continue to be a staunch proponent of public transit for myriad reasons, the reality is, for me, Waymo has proven itself far more accessible. The on-demand nature of it is arguably the best accessibility feature, with the fact I’m traveling alone coming in a close second. For better or worse, Waymo plays to my introverted nature; I needn’t worry about a crowded bus—where, even with my Blind cane in tow as identification, I’m not guaranteed a seat up front, however entitled—nor do I have to contend with a chatty Lyft or Uber driver. What’s more, Waymo’s technical attributes—its phone-centric design and self-driving technologies—are both seriously strong plays to my soul as an unabashed tech nerd.
To that point, the ACB’s sentiments about Waymo engendering greater freedom and independence for disabled people and their mobility resonate greatly with me. While I’m in no way proclaiming the company’s service is perfect, I am comfortable in making the proclamation that the anti-robotaxi brigade is myopic in its fervor. Safety and technical competence are obviously important; it’s disingenuous, however, to claim autonomous vehicles, be they Waymo’s or the competition’s, are utterly detrimental. To suggest so is to be willfully obtuse about their potential for genuine good. To wit, that Waymo has been so transformative for me (and others) isn’t the least bit trivial or niche.
Beyond San Francisco (and next year, Washington DC), Waymo offers service in Austin, Los Angeles, and Phoenix. The company plans to add Atlanta and Miami in the future.
Looming AirPods Max Update Reminds Accessibility Applies to Hardware, Too
Apple on Monday made a surprise product announcement: the Cupertino-based company shared on its Newsroom website that, come iOS 18.4 next month, AirPods Max will gain support for lossless audio and ultra-low latency audio. In a corresponding move, the company also announced a USB-C to 3.5 mm audio cable. Apple notes the new $39 (!) accessory is capable of “[connecting] AirPods Max to 3.5 mm audio sources like the audio-out port on an airplane,” as well as connecting iOS and iPadOS devices to 3.5 mm audio ports such as car stereos and other types of speakers.
With the forthcoming iOS 18.4 update, Apple boasts AirPods Max “will become the only headphones that enable musicians to both create and mix in Personalized Spatial Audio with head tracking.” The company adds AirPods Max will also provide “the ultimate listening experience and even greater performance for music production.” At a technical level, iOS 18.4 will enable “24-bit, 48 kHz lossless audio,” specifications which Apple says are key to “preserving the integrity of original recordings and allowing listeners to experience music the way the artist created it in the studio.” More than 100 million songs on Apple Music support high-fidelity, lossless audio, according to Apple.
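For the numerically curious, those specifications translate to a substantial raw data rate. A back-of-the-envelope calculation—assuming two-channel stereo PCM, and before lossless compression (Apple Music uses ALAC) shrinks the stream—looks like this:

```python
# Uncompressed PCM bitrate = sample rate × bit depth × channel count
sample_rate = 48_000   # Hz, per Apple's stated iOS 18.4 spec
bit_depth = 24         # bits per sample
channels = 2           # assuming standard stereo

bitrate = sample_rate * bit_depth * channels
print(f"{bitrate:,} bits/sec (~{bitrate / 1e6:.1f} Mbps)")  # → 2,304,000 bits/sec (~2.3 Mbps)
```

Roughly 2.3 Mbps of raw audio, in other words—a useful yardstick for why lossless playback over Bluetooth alone has historically been a non-starter and why the wired USB-C path matters here.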
Apple’s SVP of product marketing Greg Joswiak called it “the ultimate audio upgrade.”
As Jason Snell observes, the soon-to-come updates make for welcome news for AirPods Max users—although it’s somewhat perplexing why this stuff couldn’t have been announced months ago. Whatever Apple’s rationale for the delay, the reason I’m covering the ostensibly esoteric and unrelated bit of AirPods Max news is because it dawned on me while reading Apple’s announcement that I’ve never written about them in an accessibility context. I’ve mentioned having used them tangentially, but never have I expounded more fully on their accessibleness to me as a disabled person.
For starters, AirPods Max are heavy. John Gruber titled his 2020 review of the original perfectly when he said heavy is the head that wears AirPods Max. He was also spot-on when he wrote AirPods Max are “indisputably nice.” I got a blue pair of the Lightning version as a birthday gift a few years ago, and I wholly concur with my friend Gruber that AirPods Max are both heavy and super nice. I’ve mainly used them on my desk to listen to music and podcasts when my partner is home and I don’t want to disturb her. They’re also useful for putting on instrumental music for when I need to hunker down and be productive. (A current favorite for this is Linkin Park’s Papercuts greatest hits anthology.) Beyond that, however, I don’t travel with my AirPods Max; they’re simply too heavy and I’m not fond of the carrying case. By contrast, on Amazon’s Prime Day last summer, I snagged a pair of the Beats Studio Pro (in white) and was immediately impressed. To my ears, audio quality seems on par with AirPods Max. It’s hard to fathom, though, the same company that designed the AirPods Max “case” owns the brand that made a case for the Studio Pro that’s truly superior in every possible respect.
As a pair of headphones, AirPods Max are terrific. The sound is incredible.
I never bothered to swap my Lightning model for USB-C last fall because, aside from some new colors and a new connector, what I have on my desk right now is essentially the exact same thing. What’s more, from an accessibility standpoint, going from Lightning to USB-C is mostly a lateral move. For my lackluster hand-eye coordination—thanks to low vision and partial paralysis in my hands due to cerebral palsy—I’d merely be trading one inaccessible port for another. Although there is a cogent argument to be made that having one’s devices all using the same connector is more accessible from a cognitive perspective, where that falls off the rails is if you have multiple disabilities as I do. It would be more accessible to have my iPhone, AirPods Pro, and AirPods Max use USB-C—certainly more convenient too. But those benefits extend only so far if, as a practical matter, I struggle to plug the cable into the port. This is an issue I’ve written about innumerable times before: to be crystal clear, I can plug things in… just not without a good degree of friction and a grand degree of expletives while I’m doing it. As I’ve often said, the win is not merely to put USB-C in all the things, as the tech media likes to espouse. That isn’t innovative. What would be true innovation is, in Apple’s case, the company using its engineering prowess to somehow fuse MagSafe with USB-C. That the port is supposed to be an open standard and universal is (mostly) immaterial for accessibility purposes. As someone who’s predominantly an Apple user, I’d rather take the company’s proprietary spin on USB-C technology if it means AirPods Max end up being more usable by me. The salient point is simply that—surprise, surprise!—not everyone has the ability to plug things in so mundanely. A magnetic USB-C port surely would go a long way in shaping an even better user experience for myself (and others).
As ever, it’s the seemingly little things that make the biggest difference for disabled people. It’s why robust reporting on disability and technology matters so very much.
Microsoft Accessibility Exec Jessica Rafuse Talks Ability Summit 2025, More In Interview
March has been a big month for Microsoft in the accessibility arena.
Besides launching the Xbox Adaptive Joystick, the Redmond-based tech titan last week held its 15th annual Ability Summit. The event, which I’ve covered with regularity for the last several years for my old Forbes column, was hyped up in a blog post written by Microsoft’s chief accessibility officer in Jenny Lay-Flurrie. She noted this year’s edition of the Ability Summit featured a whopping 20,000 attendees from 164 countries, who gathered to discuss what she described as “the future of AI and accessibility.”
Microsoft, Flurrie wrote in the lede, is “innovating faster than ever before.” Disabled people, she added, are standing right there at the forefront, helping to lead the charge.
While I’ve sat down virtually with Flurrie numerous times for candid, on-the-record conversations about Ability Summit and much more, I spoke with Jessica Rafuse about this year’s event. Rafuse, who’s been with the company almost 9 years, is Microsoft’s director of accessibility strategic partnerships and policy and, like Flurrie, a disabled person herself. Rafuse copes with muscular dystrophy, uses a wheelchair, and is a mother of children with disabilities. She characterized her role as helping lead the company’s work with entities outside of the organization in its efforts to “accelerate accessibility.” The work, she added, is important because it gives Microsoft an opportunity to “gain feedback and insights” on how best to support its customers with disabilities. The company does this in three ways, Rafuse said: through building trust, spurring adoption, and leveraging AI. Many NGOs, or non-governmental organizations, are unaware Microsoft ships products—like the Xbox Adaptive Joystick—designed to help disabled people. In terms of AI, Rafuse said much of the work in building awareness is educating people on how AI can be a profound assistive technology.
As to Ability Summit, Rafuse called it an “all day event” for Microsoft, streamed online but also featuring an in-person component at the company’s headquarters. Company executives and community leaders, she added, came together to bask in the excitement of “[sharing] the latest greatest in accessibility and AI.” According to Rafuse, Ability Summit is “one of the few moments we all get together physically here in Redmond,” adding “we just celebrate a little of the work that our accessibility professionals are doing.” Put another way, Ability Summit is one forum in which Microsoft can deservedly put accessibility (and disabled people) in the spotlight.
“[Ability Summit] is a time for us to celebrate the work that is sometimes hard but always really rewarding,” Rafuse said of Microsoft’s chance to amplify accessibility.
When asked about the current state of accessibility in the industry and in society writ large, Rafuse said Ability Summit exists in large part to answer that very question. More pointedly, the sheer fact tens of thousands of people attended this year’s event—“more folks than we’ve ever had,” Rafuse told me—is a reflection of two bigger trends. First, she said, is the sociopolitical climate in the United States at the moment. There’s a lot that’s unknown right now; many people are looking to captains of industry like Microsoft to guide them through what Rafuse called the “regulatory unknown.” To that end, Rafuse said this year’s Summit saw an increase in European attendees, which she said was unsurprising given the looming European Accessibility Act scheduled to take effect come June. On its website, the European Commission describes the legislation as “a directive that aims to improve the functioning of the internal market for accessible products and services by removing barriers created by divergent rules in Member States.” There’s “an appetite” from Europeans, Rafuse explained, to glean Microsoft’s feelings on the coming law. Beyond the European Accessibility Act, Rafuse said people are keenly interested in how artificial intelligence can influence and impact accessibility; attendees, she went on to say, are interested in how Microsoft is leveraging artificial intelligence in this realm, particularly in responsible ways.
“There’s a lot of hope [around accessibility] right now,” Rafuse said.
Rafuse reemphasized the cruciality of community in furthering Microsoft’s accessibility efforts, telling me the company has a strong contingent of disabled employees internally. Again, she noted soliciting feedback is critical for the company, so a big part of the strategy is to partner with third parties externally. Microsoft engineers, Rafuse said, talk endlessly about building accessibility into design so as to be part and parcel of the developmental process. That necessitates “co-creation” with members of the disability community, she added, and Microsoft actively seeks out, and summarily cherishes, the feedback loop with the broader community. To wit, working on accessibility is decidedly not a solo endeavor—as the saying goes, it takes a village.
Overall, Rafuse said Microsoft loves Ability Summit because the company loves accessibility. It gives the company a platform atop which to raise awareness—to educate and evangelize. Accessibility, whether in product or events, is “an ongoing process” for the company. Ability Summit is one thing, but then the proverbial page turns to the next big date on the accessibility calendar: Global Accessibility Awareness Day in mid-May. Put simply, accessibility is always on the minds of Rafuse, Flurrie, and scores of other folks who work on accessibility within Redmond’s walls. Rafuse told me this year’s Ability Summit was special because it was the first one to be held in person post-pandemic; the last in-person edition was 6 years ago, in 2019, at Microsoft’s on-campus conference center.
“It was nice to see and meet with customers at Ability Summit in person,” Rafuse said.
Beyond introducing this year’s Ability Summit, Flurrie’s blog post included highlights of Microsoft’s work in accessibility of late. Amongst her callouts are Microsoft’s hardware shipping in accessible packaging, Copilot helping those in the neurodivergent community, speech recognition seeing a 60% improvement, and more.
Instacart Makes Grocery Shopping More Personal, Accessible With New AI Features
San Francisco-based grocery delivery company Instacart issued a press release earlier this week in which the company announced its so-called “Smart Shop” feature. The functionality, which Instacart says is powered by artificial intelligence, is touted to “[offer] more personalized recommendations based on preferences and provides clear nutrition and dietary information at consumers’ fingertips.” With Smart Shop, Instacart said, the company is “making online grocery shopping more intuitive by analyzing customer habits and dietary preferences to surface the most relevant products faster.”
Besides Smart Shop, Instacart introduced something it calls Health Tags. The product of a collaboration with the American Diabetes Association, Health Tags are described by the company as “[providing] detailed and transparent nutritional information across the catalog.” Accompanying them are Inspiration Pages, “curated destinations within the Instacart experience featuring expert-backed health recommendations and shoppable recipes.” The company noted Health Tags, which it says use evidence-based guidance and recommendations, are being released coincident with National Nutrition Month. According to Instacart, the features are purposely designed to assist customers with “[discovering] relevant products based on their unique health and lifestyle preferences on Instacart.”
“At Instacart, we want to turn the ordinary task of grocery shopping into a delightful, personalized shopping experience that takes the mental load out of finding the exact items that meet your preferences,” Daniel Danker, Instacart’s chief product officer, said in a statement for the company’s announcement. “By combining our new Smart Shop technology, Health Tags and Inspiration Pages, we’re not just improving online grocery shopping—we’re reimagining it, making it seamless to go from intention to action. By customizing your shopping journey to match your personal health goals or fit your dietary restrictions, we can unlock possibilities that weren’t even on the table before.”
Together, Instacart says, Smart Shop and Health Tags “enable consumers to shop according to their unique dietary and household preferences, [which simplify] the process of finding relevant products and making more informed grocery choices.” As to Smart Shop specifically, Instacart says it’s powered by an “industry-leading catalog” of 17 million unique items, alongside a “proprietary dataset” comprising “millions of grocery shopping journeys.”
Over 70% of consumers have at least one dietary preference, according to a recent Instacart survey.
This week’s news of Smart Shop serves as a good reminder of how Instacart is beneficial to accessibility in multiple ways. I’ve covered the company in the past, and the most obvious benefit to using them is the fact someone does the shopping for you before delivering the goods to your doorstep. Although certainly convenient, the reality is Instacart can be a godsend to those in the disability community who are immobile—whether physically, logistically, or some combination thereof—and thus literally cannot (or should not) venture out to do their own shopping. This use case extends far beyond sheer amenity; Instacart’s existence means a disabled person has access to the things they need for day-to-day sustenance and survival. By the same token, Instacart’s guidance and recommendation engine is yet another example of AI’s profound potential to do genuine good for people. It’s entirely plausible these new features from Instacart help a disabled person make better purchasing decisions so as to be in better alignment with whatever condition(s) they cope with on the daily. Moreover, from a cognition standpoint, the personalized information works wonders in alleviating the mental load of not only keeping a shopping list, but more crucially, finding the items within the store’s digital aisles. In-person grocery shopping isn’t a task for the weary.
Of course, the overarching reason for Instacart’s potential for greater accessibility lies in technology—specifically the smartphone. Disabled people use smartphones too, and Instacart’s app—available on iOS and Android—is one more tool with which to make life easier and more accessible. In very much the same ways Facebook makes socialization with far-flung family and friends more accessible, or how on-demand Uber and Waymo rides make getting around town more accessible, Instacart can make getting groceries less burdensome due to the modern smartphone’s superpowers.