Do you use the Meta Ray-Bans?

Hey everyone,

Do you own the Meta Ray-Bans, and what do you use them for? I really like that I can walk and read signs if I know I am near a business or store.

I use them every day. They aren’t ‘amazing’ at accessibility, but then accessibility is only their side hustle!

I’ve found them to be very useful, and I think they will only get better with new software updates. I am hoping they will add people detection and better OCR, though.

Yesterday, I was thinking about dinner. I knew I had a pouch of microwave rice in the cupboard, but I wasn’t sure it was the right sort.

I walked into the kitchen and took the pouch out of the cupboard. I held it up, recited an incantation, and a disembodied voice replied, “This is a pouch of Tilda Wholegrain Basmati Rice.”

Fifty or a hundred years ago, doing this with a pair of sunglasses was science fiction; three hundred years ago it would have gotten me burnt at the stake.

Has there ever been a better time to be blind?

I'm curious. Did anyone get an update to their glasses as recently as a few days back? I got a notification saying my glasses had been automatically updated or something, but on opening the notification, there was nothing...

I got an update last Friday, but I don’t know what the difference is.

Version 7 came out a while back, who knows how long ago now; life is crazy ATM!

Was it the one that introduced gestures? Yes, it was, I remember now: you can tap instead of saying ‘Hey Meta’, for example.

You can do that? I never realised.

I think you’ve always been able to use a tap instead of saying Hey Meta. The update allows you to ask follow-up questions and added a few other things. I think someone created a topic on here with the release notes in it. After it updated I could see the release notes in the notifications area (there’s a button on the home screen). They have since disappeared as I’ve read them, but it could be worth a check if you didn’t tap them. This is in MetaView.

I wear the glasses every day but probably use the AI a few times a week. I can use them to tell me what’s in bottles, for example. It often gets it wrong and talks nonsense, but if it says something I recognise then I have reasonable confidence that it’s right. I used them the other day to tell me which of my bins was garden waste and which was recycling, which is boring but made me happy. When I first got the AI I tried to get it to tell me what signs said but didn’t have much luck, but maybe it has improved.

I think the WhatsApp video calling thing is excellent. I’ve only used it once or twice but it was fantastic. I also like being able to record little videos. The audio on the videos is great and it’s nice to have some memories I can listen to. Then I can ask the pixies to describe them if I need more context.

I have noticed a few issues though. I get told a lot that it can’t help because it hasn’t got an internet connection. Sometimes that’s right, but at least a couple of times I’ve checked and my phone has had 2 or 3 bars. Sometimes quitting MetaView and reopening it fixes the issue. I’ve also had a couple of occasions where the AI just breaks and goes quiet. That one is usually harder to fix, but opening and closing the app, closing and opening the arms on the glasses, and turning them on and off completely eventually sorts it out.

But I do find it frustrating when I go somewhere and try to use them only to be denied for whatever reason, and it’s hard not to feel a bit abandoned. I should say I live in the UK and use a VPN to grant me access, then turn the VPN off, and they then work for about a month before I need to repeat. It’s possible this doesn’t help with the connection issues.

I’ve also had some issues with magic tap: suddenly my audiobook on Easy Reader starts playing. I found that when I open Voice Vista, Hey Meta breaks and just plays my audiobook instead, as if it’s doing a magic tap. It’s quite annoying, but no one else seems to have this.

But despite the niggles I love them. If some of these things can be tweaked and the AI improved then they would be even better.
