Solos AirGo Vision
Rating: 2.5 Stars
Pros
- AI assistant is smart
- Personalized information
- Swappable frames
- App is full-featured
Cons
- Audio is sub-par
- Camera quality isn’t very good
- Controls can be hard to find
- Questions around privacy
Smart glasses are getting increasingly popular. Five years ago, you might have assumed that a pair of smart glasses would include an AR-style system for overlaying digital information on top of the real world. While companies are still working on that, for now, the better smart glasses are essentially devices that can hear and see the world around you and feed that information into an AI assistant that can help you perform certain tasks. The most famous of these are the Meta Ray-Ban smart glasses, but other companies are working on the tech too, including Solos with its Solos AirGo Vision glasses.
Solos has actually been working on smart glasses for a few years now, the best known of which are the Solos AirGo 3. Those glasses integrate with ChatGPT and let you access information from the AI assistant quickly and easily. The Solos AirGo Vision glasses are very similar, but they swap the regular frame for one with a camera built in, supercharging the assistant with the ability to see the world around you.
Of course, that doesn’t necessarily make the glasses worth using. How do they actually perform in the real world, and can they be genuinely helpful? I’ve been using them for a few weeks now to find out.
A mostly natural design
The best smart glasses are those that don't reinvent the wheel when it comes to glasses design. Most people are self-conscious about anything they wear on their face, so a design that seriously stands out is unlikely to win over many buyers. The Solos AirGo Vision glasses do a pretty good job of looking much like regular glasses, with the only giveaways being the tiny cameras and the thicker arms.
The glasses come in a few different styles, but they all look pretty good. The cameras sit at the front end of the arms, slotting into place when the glasses are unfolded. The right arm is where you'll find the touch controls, which include swipe and tap gestures. You can swipe forward or backward for volume control, or tap to play or pause music. You can hold down the circular touch button toward the end of the touch area to activate the assistant. The touch controls are easy to remember, but I found the touchpad tricky to locate, even after using the glasses for a while. That's in stark contrast to the Meta Ray-Ban glasses, which use more physical buttons and have an easier-to-locate touchpad. And because the arms and frame themselves respond to taps, I found myself regularly triggering audio playback by accident, which got pretty frustrating.
The arms on the glasses can be removed, so you can swap the frame out for different styles. All of the brains are contained in the arms themselves, not the frame. You might want to do this if you don't always want cameras on your glasses, so they look a little more natural and feel a little less intrusive for those around you, but it's also useful for simply switching to an alternative style.
That said, I found that the design and build quality of the glasses aren't quite on par with the Meta Ray-Ban glasses. That isn't all that surprising; it helps to have a company like EssilorLuxottica's Ray-Ban building your glasses. But it's still something to keep in mind.
Of course, not everyone will find glasses to be the best form factor for accessing an AI assistant. That's especially true for those who don't wear glasses in the first place, like myself. Sure, I wear sunglasses at times, but not all that often. This will change once smart glasses can actually start displaying visual information over the world around you, but for now, it takes some getting used to.
A well-designed app, and responsive AI
Of course, central to the experience of using these glasses is the AI Assistant and the accompanying app.
First, the app. It lets you control the glasses' various settings, including camera settings like image resolution, EQ settings for the built-in speakers, and even touch control sensitivity. On top of that, you can access any photos you've taken with the glasses, and, perhaps most important of all, you can reach the AI assistant, which works not only through the glasses themselves but also by typing in the app or through voice commands.
There are some features missing from the app when it comes to the AI assistant, which is called SolosChat. You can set things like voice speed and the style of response, whether you want it more concise or more detailed. However, you can't choose the voice itself, so you're stuck with the single voice option.
That said, the AI assistant proved reasonably helpful for some day-to-day tasks. SolosChat is powered by GPT-4o, which makes it one of the more capable assistants out there. On top of that, it can leverage the cameras built into the glasses, as well as your phone's location, for more personalized responses. For example, you can ask it for suggestions on nearby restaurants or for directions to a certain location. You'll have to decide for yourself whether you want to hand over information like your phone's location, but some of these features help make the glasses more capable than the Ray-Bans in certain respects.
Generally, the assistant accurately picked up my voice the vast majority of the time, even when I spoke quietly or mumbled. As someone who regularly uses OpenAI's Whisper model for voice-to-text, I wasn't all that surprised; it's very good.
All that said, I didn't find all that many use cases for the AI. It was cool to ask it what it saw through the cameras, or to find out information about local businesses, but I didn't use it often enough for it to feel genuinely useful, let alone worth carrying the glasses with me when I went out. To be fair, different users will find different use cases for an assistant like this, and I have seen other reviewers find plenty of uses during the day, like asking the assistant for information at work without diverting much attention from the task at hand.
I also have issues with how the glasses handle privacy. It makes sense that they ask for location access, since that's what enables location-based information. It also makes sense that the app requests photo library access. But it does not make sense that it asks for access to all the photos in your library. When it first requested access, I only granted access to selected photos, which is enough for the app to save the photos captured by the camera. After that, it constantly requested full access to the photo library, and I have no idea why. I never granted it, and I suggest you don't either; there's no added functionality to be gained from that kind of access.
There are situations in which SolosChat might be more useful, such as when you’re traveling. Like pretty much every single AI tool out there, the service has a translation feature built into it, and it seemed to work quite well. SolosTranslate integrates with the app too, so it can write out what you’re saying as it translates for the other person to read and respond to.
A low-quality camera that’s fine for AI
The Solos AirGo Vision glasses have two cameras built into them, one on each side of the frame. These cameras are predominantly used to give SolosChat visual information about the world around you, but they can also be used to simply take a photo.
You might not want to, though. The details the AI assistant could pull from what the cameras see were pretty impressive, but as far as camera quality goes, photos look pretty bad. Highlights are completely blown out, and details are lost in the shadows. You're much better off pulling out your phone to take a photo, even if you have a budget phone with lower-quality cameras.
It was a little disappointing to see this, as while the Meta Ray-Bans don’t capture excellent images, they were far and away better than those captured by the Solos AirGo Vision glasses. It really wasn’t even close.
Sub-par audio
Perhaps more important than the cameras, however, are the speakers, and they left a lot to be desired too. That was especially true when comparing them directly with the Meta Ray-Ban glasses. Seriously, it was a night and day difference.
The speakers built into the Solos AirGo Vision glasses were tinny and thin, lacked depth, and frankly just sounded bad. To be clear, the speakers in the Meta Ray-Ban glasses don't sound amazing either, but they're at least serviceable and have some bass to offer. Excellent audio may not matter all that much when you're talking to an AI assistant, but if you plan on using the glasses for things like music at all, expect subpar sound.
They were also tinnier than I expected, lacking bass and detail in the high end. To be fair, most open-ear speakers, including open-ear earbuds, suffer from the same shortcomings, so this wasn't all that surprising. But compared directly with the Meta Ray-Ban glasses, the audio was quite a bit worse.
Conclusions
The Solos AirGo Vision glasses are a bit of a mixed bag. The AI assistant they integrate with is pretty smart, thanks in part to being built on GPT-4o, and I like that it can offer personalized information based on things like your location.
However, other aspects of the glasses simply don’t live up to the competition. The camera quality is very low, and the speakers sound horrible, so you’ll end up wanting to use earbuds with the glasses instead of relying on the speakers built into them. If you like the idea of having access to ChatGPT instead of competing AI services and don’t really care about things like camera quality or speaker quality, then you may find these to be a solid pair of smart glasses. Otherwise, however, you should stick with the competition.
The competition
The biggest competition to these glasses comes from the Meta Ray-Ban smart glasses. There are things about the Ray-Ban glasses that are objectively better, like the speaker quality and the camera quality. Then there are things you'll have to weigh for yourself. Not everyone wants a camera connected to Meta on their face all the time, let alone to use Meta's AI assistant.
But, to be clear, the version of ChatGPT built into the Solos glasses still routes through Solos, which also gains access to things like your location and any conversations you have with the assistant. If you're worried about the Ray-Bans for privacy reasons, that makes sense, but don't automatically assume that these are any better in that regard.
Should I buy the Solos AirGo Vision?
Maybe, if you want smart glasses with ChatGPT and can look past the poor camera and speaker quality.