Latest news with #VR


CBS News
an hour ago
- Business
- CBS News
This Boston apartment building is using VR goggles to show homes to prospective buyers
VR technology being used to show apartments to people at one Boston building

Virtual reality is a popular way to play video games, but now a Boston apartment building is using the technology for something a bit different: selling homes.

The Ritz-Carlton Residences at the South Station Tower is using VR goggles to show fully furnished homes that don't exist yet. Real estate agents for the tower can guide prospective buyers from anywhere in the world. The agents use a tablet to view everything that the VR goggles display, which allows them to virtually jump the user from room to room.

"My favorite part is witnessing people experience it," says Manuel Davis. "I haven't seen anyone not say, 'Wow, this is incredible.'"

The goggles are also equipped with safety features, including a red grid that alerts buyers when they may be approaching a real-life object.

The idea stems from a collaboration between the Ritz-Carlton, Williams Papadopoulous Designs and rndr, a VSN company. "I realized people couldn't really view 2D renderings and floor plans, and they needed to be able to see what they were going to buy without necessarily having to go there," the founder and CEO of VSN, Nate Robert-Eze, said.

The condos are virtually furnished with high-end pieces, which Mark Williams, the founder of Williams Papadopoulous Designs, helps curate. "That's what's so important about this VR technology," says Williams. "In the virtual reality space, you really do feel it so much more than if you're looking at a two-dimensional rendering."

Robert-Eze believes that VR is the future of real estate, allowing house hunters to view a space anywhere in the world. And the technology is constantly evolving. "We've built a software called Path," Robert-Eze told WBZ-TV. "That takes that immersive 3D environment and actually adds an AI component to it. The space becomes intelligent and you can start asking questions about the space, about the area."


Forbes
a day ago
- Business
- Forbes
The Metaverse Was Supposed To Change How We Bank — What Happened?
A visitor experiences a VR headset and hand controllers during the 2023 World Conference on VR Industry on October 19, 2023 in Nanchang, Jiangxi Province of China. (Photo by Zhu Haipeng/VCG via Getty Images)

The metaverse was supposed to be transformative. In March 2022, Meta CEO Mark Zuckerberg called it 'the next chapter of the internet overall.' Three years ago, Satya Nadella, CEO at Microsoft, wrote on LinkedIn, 'The metaverse is here, and it's not only transforming how we see the world but how we participate in it — from the factory floor to the meeting room.' Back then, everyone was talking about it. And then, they stopped.

The banking industry embraced the metaverse at the height of its hype, with some institutions even launching their own initiatives. Coastal Community Bank, for example, developed Coastal World, a 3D digital banking game and marketplace, while Quontic Bank created an outpost in Decentraland. But, as it did elsewhere, the metaverse eventually lost its luster in financial services. These ecosystems still exist; they haven't been shuttered, but there haven't been any obvious updates lately. And rarely does the metaverse come up in conversations among bank executives anymore.

So, what happened? Is the metaverse dead? Is there still any opportunity to be had?

Overall, the technology industry took a hit in the last few years as macroeconomic conditions deteriorated and private capital contracted. This is likely at least partly to blame. But beyond that, the metaverse ran into two distinct challenges: limited utility and a lack of societal readiness. In short, there weren't any killer use cases (outside of gaming, which is quite niche), and people were not ready to, as Tesla and SpaceX CEO Elon Musk put it, strap 'a frigging screen to their face all day.'

However, the metaverse opportunity may still exist — albeit in a less flashy, less monumental way. This is especially true in the context of two current 'hot' trends: artificial intelligence, including the rise of AI agents, and stablecoins. Here are a few ways that elements of the metaverse could play out as the banking industry grapples with the implications of these two technologies:

- Supercharged, generative AI-powered customer service is on the horizon. While the technology is not quite ready for primetime with bank customers, it's getting there. The form factor for such interactions today is a chat interface, but that may not always be the case. As agents transform into avatars, a metaverse-like environment may make more sense, especially if it takes the form of a virtual lobby or bank branch.
- Generative AI is already being incorporated into learning and development initiatives at community banks. LemonadeLXP, for instance, is a learning platform that uses generative AI to help bank employees create trainings and courses. Over time, such materials could evolve into a virtual setting. According to CEO John Findlay, 'Historically, the issue with 3D experience training is cost. If AI makes 3D environments affordable, it could become an excellent tool for teaching soft skills and situation training such as customer service or sales, and leadership and conflict resolution.'
- Stablecoins are demonstrating potential utility in areas like online gaming, e-commerce, and cross-border payments. However, users need to be able to convert these digital assets to fiat currency and back again. Banks are well positioned to provide the on- and off-ramps necessary to facilitate these transactions, including in virtual environments. Banks can also offer secure custody services.

These are only a few examples — there are likely many more. Particularly when it comes to AI, any place where there is a chatbot today could benefit from an avatar in the future. And avatars need somewhere to live. As these virtual assistants grow more and more human-like, where they reside may grow more and more world-like.

It's unlikely that the metaverse will regain its past momentum. Generally, once a buzzword dies, it stays dead. But that doesn't mean the spirit of the metaverse doesn't continue to present possibilities, for banks and for others. As society further experiments with advanced robots, digital currencies, and other new technologies, virtual ecosystems could emerge as an important mode of interaction. Most likely, though, they'll have a new name.


Yahoo
2 days ago
- Entertainment
- Yahoo
Beat Saber support is ending on PS VR and PS VR2
It's the end of the line for Beat Saber on PS VR and PS VR2. While you'll still be able to buy and play the base game on both platforms, as well as any songs and music packs that were released before today (June 18), Beat Games is winding down support for those versions. They won't get any new songs or music packs. As such, the final song that became available for Beat Saber on PS VR and PS VR2 was Lady Gaga's "Abracadabra." Moreover, the PlayStation versions of Beat Saber will lose their multiplayer features on January 21 next year. Beat Games says that it will still provide customer support to players on those platforms.

"As we look to the future and plan the next big leap for Beat Saber, we have made the decision to no longer release updates for PS4 and PS5 starting in June 2025," Beat Games wrote in a statement on X and its website. "Our passion for VR remains unwavering. We are excited about the possibilities that lie ahead and what we can bring to Beat Saber fans who have been on this journey with us over the past seven years."

This change doesn't affect the SteamVR version of Beat Saber. New songs, music packs and features are still coming to Beat Saber on Steam and Meta Quest platforms. Meta bought Beat Games back in 2019. In effect, the company is ceasing Beat Saber development on platforms that do not support its own headsets.

It's most likely that Meta and Beat Games are ending their efforts on the PlayStation versions of Beat Saber because they're no longer seeing enough of a return on investment (though ending multiplayer support is an odd move). By all accounts, Meta Quest headsets have far outsold PS VR2 units.

It could be argued that Sony hasn't fully gotten behind its own platform. By my count, there are fewer than two dozen PS VR2-exclusive games. The most recent State of Play stream featured only one (non-exclusive) game for the platform, Thief VR: Legacy of Shadow.

So, it's maybe not surprising that Meta is pulling the plug on Beat Saber on PlayStation's VR headsets. It's still a shame, though, as Beat Saber arguably remains one of the best VR games around, and maybe even the killer VR app. At this point, it might be best for PS VR2 owners who have a capable enough PC and want more Beat Saber songs to pick up the PC adapter and play the game on that platform. After all, Beat Saber is moddable on PC (and Meta Quest), and there are thousands of custom song maps available.


Yahoo
4 days ago
- Entertainment
- Yahoo
Meta Adds VR-Based Avatar Achievements To Drive More Interest in Its Next-Level Experiences
This story was originally published on Social Media Today.

Meta's working on a new way to spark more interest in its evolving VR experiences, with users now able to view a list of 'Avatar Quests' on Facebook and IG that relate to various VR experiences.

In an example shared by app researcher Jonah Manzano, some users are now able to view an Avatar Quests tab in their profile options, which leads to a listing of achievements in VR games and worlds that can be completed via your Meta character. So playing 'Kaiju City Showdown' or 'Super Rumble Rocket Ruckus,' both VR titles, will grant your avatar XP, essentially gamifying VR experiences in order to spark broader interest. And if Meta can get you to compete with your friends on completing these various tasks, in order to gain more XP, that'll enable it to drive more VR engagement, and potentially spark more interest by, say, showing your achievements to your connections.

It's a pretty light-touch means of driving more interest in its VR experiences. But as Meta gradually edges us towards its broader metaverse shift, it could be another helpful step. And yes, despite Meta shifting its focus to AI, and basically culling all mentions of the metaverse from most of its PR materials after initial backlash to the concept, the metaverse is still very much on the cards for the future of its business.

Meta's hope is that AI will empower the next stage of VR development by democratizing VR creation, which means that, eventually, you'll be able to create your own immersive VR experience simply by speaking it into existence. Meta's already well on the way on this front, having recently previewed its updated Horizon Worlds editor, which can now use AI prompts to generate VR objects. So if you want to create a craggy mountain range, which you can then explore in fully immersive VR, you can now do it by simply typing in what you want to see.

Meta's still developing this, but that's where its AI push integrates with its metaverse vision. And once everyone can create amazing, engaging VR experiences, that'll help drive more interest in VR more broadly, while its coming AR glasses will also play into that more immersive day-to-day environment, as we get more used to interacting via headset wearables than handheld devices. It all connects, it all aligns.

And as such, the more Meta can drive interest in things like its digital avatars, the more it can boost interest in this next phase. Avatars may seem like a much smaller element in this respect, but when you consider how kids already use avatars to engage in the current version of metaverse experiences, in gaming worlds like Fortnite and Roblox, it makes sense that this would be the logical medium for Meta to drive its metaverse push. As noted, it's another small step toward a wider Meta-built digital world.


Forbes
4 days ago
- Business
- Forbes
Top Takeaways From Augmented World Expo 2025
Advancements in AR, VR and XR technologies are driving demand for new devices and software. The premier show on VR, XR and AR took place June 10-12, 2025, in Long Beach, CA. Over 5,000 people attended the 16th edition of this conference to get updates on what was new and hot in augmented reality.

One major shift for the show is how it has evolved its focus. In the last two years, the term "spatial" was highly promoted to define the show; spatial was to embody the world of all things augmented. While the term spatial is still prominent in the show's promotions, this year the event seemed to embrace XR (extended reality) as the more accurate umbrella term, reflecting that the show now fully spans the VR, MR (mixed reality) and AR (augmented reality) world.

Qualcomm, Samsung, Sony, Lenovo, Google, Xreal, Snap (with Snap Spectacles), Unity, Pico, and over 200 others representing new XR hardware, software, and dedicated XR service providers had booths at the show. The show featured three days of conference sessions exploring every aspect of XR. More than 450 speakers took the stage, representing leading tech companies, Fortune 500 firms, and innovative startups. There were also over 250 sponsors and exhibitors. Thousands of attendees—including creators, developers, industry executives, founders, entertainers, investors, top media, and more—participated both in person and virtually. Topics ranged from enterprise case studies, the latest developer and creator tools, sales and marketing strategies, branded experiences, AR Cloud, WebXR, 5G, AI, Web3, haptics, privacy, and ethics to entertainment, education, and beyond.

One standout observation: the quality of speakers at this event was truly exceptional. I attended one session on case studies for XR training, where the speakers were leaders of the XR training programs at Duke Energy and Volvo. Many sessions had speakers discussing the real-world usage of XR in their businesses today.

One significant takeaway from this year's show was the role AI is playing in XR. The underlying theme of the event was marrying AI+XR across the sessions and exhibitions. There is a good reason for this, as AI has become a key factor in XR applications and services. In the past, most XR content was created by developers and specialized service providers writing their own code. But this year, those same people highlighted how AI is now empowering them to build more powerful XR solutions and speeding up the delivery of creative programs and services for their customers.

Another surprising part of the show was that many sessions stressed the importance of XR's human impact. Over the years, most of the talk at tech shows has focused on technology, with little thought about how it affects people. However, Jason McGuigan of Lenovo said in his main stage presentation that we as an industry have to be more aware of how XR can and will augment the human experience. He pointed out that today, people experience their world through their five senses, but XR will add dimensions to that experience by giving them new information and experiences that augment their current world. He argued that the cyborg, usually a highly negative concept, is really just technology enhancing a person's real world.

Another major takeaway from this year's show was the strong focus on smart glasses. With the event's pivot to XR, smart glasses have emerged as a central theme and will likely continue to shape the show in the near future.
While the XR market was previously dominated by VR, there is now a clear surge in interest in smart glasses. This was evident at AWE, where at least 20 vendors showcased new smart glasses and dozens of sessions were dedicated to smart glasses technology.

I was privileged to moderate a main stage panel on smart glasses and their future. I was joined by Ralph Jodice, GM of North America and Head of Partnerships & Publicity at Xreal; Kelly Ingham, VP of AR Devices at Meta; and Jason McGuigan, Head of Commercial VR at Lenovo. Having these top executives on the panel allowed us to explore where smart glasses are today and where they will be in the next two years. These folks are authorities on this subject and play significant roles in their companies' XR strategy and planning. All three agreed that in the next two years, we will see more exploration of new types of smart glasses with new styles and exciting features.

The panelists explored the current types of smart glasses that are driving demand today. Meta has led the consumer smart glasses revolution with its Ray-Ban Meta Wayfarer glasses, which have sold over two million units. These represent the AI smart glasses category, as Meta and others in this space have added AI audio feedback to the glasses. The panel agreed that AI smart glasses will likely drive the strongest demand for these types of glasses in the next two years.

The second category of smart glasses now developing is the kind Xreal has on the market. These use "birdbath lenses" and are optimized to deliver large-screen viewing experiences. Ralph Jodice explained that the newest Xreal Pro 2 now provides a 70-degree field of view and, when tethered to a device like a PC, smartphone, or mobile gaming device, lets you view that content on what appears to be a 100- to 200-inch screen through the glasses. They are optimized for watching movies, playing games, and use in work environments. These glasses are shipping now. Mr. Jodice also stated that Xreal will support Android XR and release a new version called Project Aura in 2026. Xreal is also adding AI feedback to its new smart glasses. I also got to see Viture's new smart glasses, which are in this same category. Both companies are making great strides in developing even better versions of their products. Another significant player in this space is Snap, with its Snap Spectacles. Although very different from what Xreal and Viture are doing, Snap has created great smart glasses that are powerful for gaming and have all types of applications for consumers and businesses.

The third type of glasses we discussed, which has a longer development cycle, is smart glasses with a video display in the lenses so a person can get visual feedback when using them. There were many great sessions on optical lenses and the challenges of getting them to work well, and from the ones I attended, it is clear that breakthroughs in optical technology are needed to get this right. The panel felt that we will see more smart glasses come to market in this category by late 2026 through 2028. If you are interested in smart glasses' optical challenges, I suggest you visit Karl Guttag's KGOnTech blog for a deeper understanding of this subject. He had the best session on this topic at AWE, and his grasp of the issue is impressive.

Finally, the panel discussed a significant topic: the future of smart glasses and the OS war on the horizon. Currently, Snap and Meta each have dedicated operating systems for their glasses.
However, Google recently introduced Android XR, a new OS for smart glasses. With support from Samsung, Xreal, and others, this will become the third OS for smart glasses. We expect Apple to deliver its own smart glasses and launch a fourth OS in the near future.

In April, I wrote a column on Face Computing that sets the tone for what I see as a coming OS battle. To date, we have had two major personal computing platforms: desktop operating systems for PCs and Macs, and smartphone operating systems in iOS and Android. But I see the next big computing market forming around face computing, where devices worn on our faces become the next significant way we deliver and work with information. We are now laying the groundwork for the next personal computing battle, in which an OS and a software ecosystem will develop and drive the concept of wearable computing. If history is our guide, we should see a huge push to get software developers to support one or two of these face computing OS platforms and start to build a significant ecosystem of apps and services for this type of wearable computer.

If I am right about face computing being the next big thing in personal computing, AWE could evolve to become the main show for the industry in this category. And as my panel of experts believes, we should see some remarkable new types of smart glasses come to market in the next two to three years.

Disclosure: Qualcomm, Samsung, Lenovo, Google, Meta and Apple subscribe to Creative Strategies research reports, along with many other high-tech companies around the world.