Snap's Qi Pan discusses how Snapchat is dealing with privacy concerns, the metaverse, and more.

Snap on Thursday hosted its fourth annual Snap Partner Summit to announce new augmented reality (AR) features aimed at assisting users in their daily lives, not merely at sharing interactive content on Snapchat. The company’s adoption of AR has helped it grow against rivals such as Meta and YouTube. Its recent advancements in interactive experiences have also helped Snapchat differentiate itself in the app world.

At Snap Partner Summit 2022, Snap announced plans to help clothing brands and businesses build AR-based try-on solutions to cut down on returns. The Santa Monica, California-based company is also making it easier for developers to create new AR experiences – what it calls Lenses.

To learn more about how Snap is integrating AR experiences more deeply into the Snapchat app and how it is responding to challenges including privacy concerns, Gadgets 360 spoke with Qi Pan, Director – Computer Vision Engineering at Snap, over a virtual call.


Snap Director of Computer Vision Engineering Qi Pan highlighted how the company is moving forward with AR
Image credit: Snap

Pan explains how Snap is building its AR shopping Lenses and plans to help developers create new interactive experiences using Lens Cloud. He also talked about Snap’s stance on the metaverse – a nascent technology space that has recently seen the entrance of companies including Meta and Microsoft. Below are edited excerpts from the conversation.

How is AR evolving at Snap? And what has been India’s role in that journey so far?

The AR journey is amazing. Even when I joined the company, five or five and a half years ago, the company’s management was very clear that AR was going to be an important part of Snap. To the outside world, it may seem like we’re getting more and more involved in AR, but internally, AR has always been very important to us. And the reason for that is that we’re looking at a five- or 10-year horizon. We expect a shift from people using cell phones as their primary computing device to using AR glasses. My team’s goal is really trying to unlock that value for the end user – trying to establish the technologies to enable new AR experiences, things like location-based AR, where you can interact with your own home, office, or street. It also includes things like multi-user AR, where you can start benefiting from actually experiencing AR with someone, instead of recording your own single-user AR experience and sending someone a video of it – which is mostly what happens today.

If we’re thinking about social AR, then in the future, people will really reap a lot of benefits from actually interacting in AR together. And that’s what Lens Cloud is for. It provides developer tools that will allow people to go beyond what Lenses can do today, to really explore new use cases – things like utility and informational Lenses, and all kinds of new use cases like that – because my belief is that AR glasses should always provide value. They will have to make your everyday life better in some way. As for your question about India’s contribution to the journey, the growth in that country is absolutely astounding. There are now 100 million users in India, which is wonderful. A lot of that growth is reflected on the AR side as well. I see that India is an interesting, rapidly growing market. And it’s really important for us to understand the use cases in AR there.

AR requires the user to open their camera – allowing the app to not only look at them, but also at their surroundings. This may not be comfortable for many users in the market, due to factors including privacy concerns. So, how is Snap trying to convince people to open the camera on their device and experience AR – without hesitation?

One of the very unique things about Snapchat is that it actually opens to the camera. We see that users are already interacting with the camera. We have 250 million people opening the camera and playing with AR every day, which is a really amazing number. I think people already have that kind of behaviour. With our Spectacles, the approach we’re taking is to be really transparent about what’s going on. Even if you look back at the first few versions of the glasses we launched, which could only capture with the camera, we made a very conscious effort to make it really clear when the camera is recording and when it is not, so that people around you feel comfortable, understanding what the hardware is doing. And the same goes for the new generation of glasses. These kinds of changes are habitual; they are gradual changes that are taking place. So as soon as you can provide real, tangible value to people, they will be ready to use the camera to improve their lives.

Snap recently allowed users to pin AR experiences to their favourite places by bringing in custom markers. How would you address privacy issues if some users try to abuse this feature – perhaps even unintentionally – by violating the privacy of others or of a real space?

We have taken a very cautious approach. We want the world to be covered in enjoyable and productive experiences. And so any location-based AR Lenses, like other Lenses, go through a moderation process, and they must adhere to our Lens guidelines and community guidelines, to ensure the content is suitable. But yeah, I think it’s a really important topic, because with any tool that is adding information to the world, we really want to make sure the content is useful.

Snap is bringing in new AR Lenses to let users try on outfits without having to change their clothes. How will the app deal with accuracy, given that sizing is a big concern when using virtual solutions to buy clothes and outfits?

A lot of these experiences work to scale. If you look at our eyewear try-on product category, it uses the front-facing depth sensor on the device to understand the proportions of a person’s face and estimate size. So you get a relatively accurate sense of how glasses will look on your face. In addition, there is another layer of experience, which is visualising objects in the world around you – like visualising a sofa or a handbag. These are also rendered to scale, so you understand roughly the size of the sofa in your living room or the size of the handbag in front of you. But clothing is a rather complicated area. The first generation of experiences will help people visually understand what the clothes will look like. But I think a really important capability for the future is helping people choose the right size – between, say, medium and large – as that will help reduce returns on online orders.

Tech companies that started with VR and AR are now moving towards the metaverse. Are there plans to enter this nascent space with Snapchat in the future?

Yes, so “metaverse” is a term used quite a lot in the industry to mean a lot of different things to many different people. At Snap, we focus on what value we bring to the end user. One of the differences in Snap’s thinking is that we think the real world is a great place. Our goal is not to take people out of the real world. We really want to enhance the real world in small or important ways. So our approach is really about how we understand the world around us, and how we can make it better – rather than taking you out of the real world and into this kind of metaverse.



