Noteworthy II.

I used to pay a lot of attention to the MacArthur Foundation Fellowship awards (the “genius grant”). I don’t pay as much attention these days, because reasons.

However, I did know that Josh Miele, whom I have written about before, was one of last year’s recipients.

Here’s a pretty cool profile of Mr. Miele and what he’s doing now. In brief, he’s working for Amazon on accessibility.

For example, when Miele joined Lab126, the group was working on Show and Tell, an Alexa feature for Echo Show devices that uses the camera and voice interface to help people who are blind identify products. Employing advanced computer vision and machine learning models for object recognition, Show and Tell can be a vital tool in the kitchen of a customer who is blind or has low vision. A person holds up an object and asks, “Alexa, what am I holding?” and gets an immediate answer.

Miele helped the team understand that they needed only to provide useful context, even just a word or two, for a person who is blind or visually impaired to identify the product. The team focused on kitchen and pantry items — things that come in cans, boxes, bottles, and tubes. The goal: Recognize items in Amazon’s vast product catalogue, or if that wasn’t possible, recognize brands and logos that could give the customer enough information to know what they held in their hand.

“If I touch a can of something, I know it’s a can,” Miele explained, “but I don’t know if it’s a can of black beans or pineapple. So, if I’m making chili, and I open a can of pineapple, I’m going to be pretty irritated.”

“I realized that the work I was doing in accessibility was both rewarding to me and something that not many people could do at the level I was able to do it,” he recalled. “I thought, ‘There are plenty of people who could be great planetary scientists but there were not a lot of people who could design cool stuff for blind people and meet the needs of the people who were going to use it.’”
