20 Sources
[1]
Meta CTO explains why the smart glasses demos failed at Meta Connect -- and it wasn't the Wi-Fi | TechCrunch
Meta Chief Technology Officer Andrew Bosworth took to his Instagram to explain, in more technical detail, why multiple demos of Meta's new smart glasses technology failed at Meta Connect, the company's developer conference, this week. Meta on Wednesday introduced three new pairs of smart glasses, including an upgraded version of its existing Ray-Ban Meta, a new Meta Ray-Ban Display that comes with a wristband controller, and the sports-focused Oakley Meta Vanguard. However, at different points during the event, the live technology demos failed to work. In one, cooking content creator Jack Mancuso asked his Ray-Ban Meta glasses how to get started with a particular sauce recipe. After he repeated the question, "What do I do first?" with no response, the AI skipped ahead in the recipe, forcing him to stop the demo. He then tossed it back to Meta CEO Mark Zuckerberg, saying he thought the Wi-Fi might be messed up. In another demo, the glasses failed to pick up a live WhatsApp video call between Bosworth and Zuckerberg; Zuckerberg eventually had to give up. Bosworth walked on stage, joking about the "brutal" Wi-Fi. "You practice these things like a hundred times, and then you never know what's gonna happen," Zuckerberg said at the time. After the event, Bosworth took to his Instagram for a Q&A session about the new tech and the live demo failures. On the latter, he explained that it wasn't actually the Wi-Fi that caused the issue with the chef's glasses. Instead, it was a mistake in resource management planning. "When the chef said, 'Hey Meta, start Live AI,' it started every single Ray-Ban Meta's Live AI in the building. And there were a lot of people in that building," Bosworth explained. "That obviously didn't happen in rehearsal; we didn't have as many things," he said, referring to the number of glasses that were triggered. That alone wasn't enough to cause the disruption, though.
The second part of the failure had to do with how Meta had chosen to route the Live AI traffic to its development server to isolate it during the demo. But it did this for everyone in the building on those access points, which included all the headsets. "So we DDoS'd ourselves, basically, with that demo," Bosworth added. (A DDoS attack, or distributed denial-of-service attack, is one where a flood of traffic overwhelms a server or service, slowing it down or making it unavailable. In this case, Meta's dev server wasn't set up to handle the flood of traffic from the other glasses in the building -- Meta had planned for it to handle the demos alone.) The issue with the failed WhatsApp call, on the other hand, was the result of a new bug. The smart glasses' display had gone to sleep at the exact moment the call came in, Bosworth said. When Zuckerberg woke the display back up, it didn't show him the answer notification. The CTO said this was a "race condition" bug, in which the outcome depends on the unpredictable, uncoordinated timing of two or more processes trying to use the same resource simultaneously. "We've never run into that bug before," Bosworth noted. "That's the first time we'd ever seen it. It's fixed now, and that's a terrible, terrible place for that bug to show up." He stressed that, of course, Meta knows how to handle video calls, and the company was "bummed" about the bug showing up here. Despite the issues, Bosworth said he's not worried about the results of the glitches. "Obviously, I don't love it, but I know the product works. I know it has the goods. So it really was just a demo fail and not, like, a product failure," he said.
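The sleep-versus-notification bug Bosworth describes can be sketched in a few lines. This is a hypothetical toy model, not Meta's code: it deterministically replays the losing interleaving (in a real race, the loss is timing-dependent and intermittent), showing how a notification that arrives in the instant the display blanks can be dropped, so that waking the display has nothing to show.

```python
# Hypothetical sketch of the bug pattern: a notification arriving while the
# display is asleep is silently dropped instead of queued for the next wake.

class Display:
    def __init__(self):
        self.awake = True
        self.on_screen = []      # notifications currently visible

    def go_to_sleep(self):
        self.awake = False
        self.on_screen.clear()   # screen blanks; visible items are discarded

    def post(self, notification):
        if self.awake:
            self.on_screen.append(notification)
        # BUG: while asleep, the notification is silently lost

    def wake(self):
        self.awake = True        # BUG: nothing replays dropped notifications


display = Display()
display.go_to_sleep()                 # display naps at the very instant...
display.post("Answer WhatsApp call")  # ...the incoming-call notification fires
display.wake()                        # waking the display restores nothing
print(display.on_screen)              # -> [] : no answer prompt to tap

# One plausible fix: buffer notifications while asleep and flush them on wake.
class FixedDisplay(Display):
    def __init__(self):
        super().__init__()
        self.pending = []

    def post(self, notification):
        (self.on_screen if self.awake else self.pending).append(notification)

    def wake(self):
        self.awake = True
        self.on_screen.extend(self.pending)
        self.pending.clear()


fixed = FixedDisplay()
fixed.go_to_sleep()
fixed.post("Answer WhatsApp call")
fixed.wake()
print(fixed.on_screen)                # -> ['Answer WhatsApp call']
```

The fix shown is just one common pattern (queue-and-flush); the source doesn't say how Meta actually resolved it.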
[2]
Meta's New Smart Glasses Got a Subtle Name Change. It Speaks Volumes About What's Wrong With Them
Having left competitors in the shade by adhering to vital lessons on what works on your face, Mark Zuckerberg has reverted to old habits with Meta Ray-Ban Display -- and that's not good. In the early days of Facebook, Mark Zuckerberg and his team famously adopted the motto "Move fast and break things." Posters with the phrase reportedly adorned the company's Silicon Valley headquarters, alongside other, similar sentiments like "Done is better than perfect," and "Fail quicker." The focus for the young company was clear: Being first was more important than getting it right first time. As the company grew, Zuckerberg -- in an interview with WIRED -- started to distance himself from, or at least temper, those mantras. But with Wednesday's announcement of the Meta Ray-Ban Display smart glasses, it feels like some of that old mentality might have started to creep back in. "Our goal is to build great-looking glasses that deliver personal superintelligence," said Zuckerberg yesterday at the very start of the Meta Connect event. He then immediately outlined some "clear values" that Meta holds sacrosanct for smart glasses. Number one for Zuckerberg was "they need to be great glasses first" with "refined aesthetics" that "shave every millimeter" from the hardware. No doubt Meta has shaved every millimeter it can from its new flagship specs, but in a rush to fully realize these next-gen glasses it looks like Meta has broken that primary value right out the gate. I got a chance to demo the Meta Ray-Ban Display, ahead of Meta Connect, at a preview event in London. The big news is they feature a small display built into the right lens that gives users visual prompts and guidance. They come with a wristband that understands hand gestures, which can be used to interact with the things displayed on the screen. Meta has labeled them the "world's most advanced AI glasses," and having tried them, it's easy to agree.
They are undoubtedly impressive, and I think most people who get to try them will like them.
[3]
Meta Explains Why So Many of Its Live Demos Failed at Meta Connect
This week's Meta Connect smart glasses event was, let's admit it, a little cringeworthy. Or at least I cringed when two of Meta's live demos epically failed. (A third live demo took some time but eventually worked.) During the event, CEO Mark Zuckerberg blamed it on the Wi-Fi connection. Now we know what actually happened. Meta chief technology officer Andrew Bosworth addressed each of the live demos in an Instagram AMA (ask me anything) session on Thursday. Let's start with the first demo fail. Special guest chef Jack Mancuso, wearing the new Meta Ray-Bans Gen 2, was supposed to show off how the new Live AI feature could process the ingredients on the table, then give him a step-by-step guide to making a "Korean-inspired steak sauce" recipe. Things went wrong immediately. The Live AI started listing the ingredients laid out, and Mancuso interrupted to ask what he should do first. With the feature clearly glitching on the sequence of tasks, Mancuso asked again, "What do I do first?" Confused, the AI-powered smart glasses told him he "already combined the base ingredients" as if the cooking process had started, even though the bowl was empty and the ingredients hadn't been touched. Bosworth said that when the video of Mancuso instructing his glasses to use Live AI was shown to the live audience, "It started every Meta Ray-Ban's Live AI in the building." The Museum at MPK 21, where the in-person keynote address was given, can hold up to 2,000 people, so that's a lot of potential smart glasses. "That obviously didn't happen in rehearsal," he added. The problem worsened because the Meta team had routed all Live AI traffic to dev servers, overloading the system. "We DDoS'ed ourselves, basically," Bosworth said. DDoS stands for distributed denial of service, a kind of attack on a server that brings down a system by overwhelming it with traffic. 
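The "we DDoS'd ourselves" failure chain can be modeled as a simple capacity problem. The sketch below uses entirely made-up numbers and class names (Meta hasn't disclosed its dev server's capacity): one wake phrase starts Live AI on every pair of glasses routed to the same access points, and all of those sessions land on a server provisioned for the on-stage demo alone.

```python
# Toy capacity model (illustrative assumptions, not Meta's infrastructure):
# a dev server sized for a handful of rehearsal devices receives a session
# request from every pair of glasses in the building at once.

class DevServer:
    def __init__(self, capacity):
        self.capacity = capacity   # concurrent sessions it can actually serve
        self.active = 0

    def start_live_ai(self):
        if self.active >= self.capacity:
            return "503 Service Unavailable"   # overloaded: request fails
        self.active += 1
        return "200 OK"


server = DevServer(capacity=5)     # hypothetical: sized for the demo alone

# "Hey Meta, start Live AI" is heard by every pair of glasses on those access
# points -- up to ~2,000 attendees, per the venue figure cited above.
responses = [server.start_live_ai() for _ in range(2000)]

print(responses.count("200 OK"))                   # -> 5
print(responses.count("503 Service Unavailable"))  # -> 1995
```

The point of the sketch is the ratio: once the wake word fans out to the whole room, the routing decision that isolated the demo also funnels everyone's traffic to the one server least able to absorb it.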
The second live demo fail was when Zuckerberg tried to showcase the new integrated WhatsApp video calling and was unable to pick up a call from Bosworth. Bosworth said this was the result of a previously undiscovered bug that's now fixed. Zuckerberg's glasses' display went dark to go to sleep right as the WhatsApp call came in, and it didn't properly display the answer option. In a separate Instagram story, Bosworth added that he didn't see big risks in doing live demos. He said Meta's went "great" and that "live demos don't represent real-world scenarios." While he's right that smart glasses users are unlikely to be in those exact situations again, it's not a great sales pitch for an expensive device if the people who created it can't properly demonstrate how tools should work. CNET smart glasses expert Scott Stein has tested the new models and didn't encounter the same issues, which is a positive sign.
[4]
Meta's failed smart glasses demos had nothing to do with the Wi-Fi
Meta ran into some spectacularly embarrassing technical issues during the live demos of its new smart glasses this week, and now we know why. Andrew Bosworth, Meta's chief technology officer, explained in an Instagram AMA why two notable demos -- an influencer asking the AI assistant for cooking instructions, and Mark Zuckerberg attempting to pick up a WhatsApp call -- failed. "When the chef said 'hey Meta, start Live AI,' it started every single Meta Ray-Ban's Live AI in the building," said Bosworth. "We had routed Live AI traffic to our dev server, in theory, to isolate it, but we had done it for everyone in that building on those access points. We DDoS'd ourselves, basically." So yeah, it wasn't the Wi-Fi, as Zuck claimed on stage. The video call issue was "more obscure," according to Bosworth, and involved a "never-before-seen bug" that occurred because the Display glasses had gone to sleep at the same moment that the device received the call notification. Bosworth says that the bug has now been fixed, but acknowledged the onstage demo was a "terrible place for that bug to show up." Even with the technical blunders, Meta providing an honest live product demonstration is a nice break from the usual fare of pre-recorded videos and generative AI errors that companies like Apple and Google showcase these days.
[5]
'That's Too Bad.' Multiple Live AI Demos Fail at Meta Connect
CEO Mark Zuckerberg gives a live demo on stage at Connect (Credit: Meta) It was a rough night for Meta after the company botched two onstage demos of its latest smart glasses at its annual Connect conference. The idea behind the specs is that a live AI assistant can see what's in front of the wearer and answer questions about it, allowing people to "stay present in the moment, while getting access to all these AI capabilities that make you smarter," says CEO Mark Zuckerberg. That's hard to do with glitchy tech. The first flop happened when an Instagram influencer, shown on the stage via a video call, asked the AI to help him prepare a Korean sauce. Before the demo started, Zuckerberg noted the company is still working through some "major technology challenges" regarding Live AI, particularly the ability to have access to it all day. To start, it'll be available for only an hour or two at a time. "Hey Meta, start Live AI," the influencer says. The AI accurately determines he has ingredients and empty bowls in front of him, and is ready to start cooking. After some small talk, the influencer asks the AI how to make a Korean-inspired steak sauce. The AI begins talking and the influencer interjects with a more specific question, "What do I do first?" The ability to interject and speak more fluidly to an AI, as someone would to a human, has been a focus of voice technology over the past two years. It's something OpenAI touts about its Voice Mode, Apple improved with Siri last year, and Amazon added to Alexa+. Meta's AI got tripped up by the interjection and did not register what the influencer said, so he repeated it. When the AI finally responded, it inaccurately assessed that he'd already combined some ingredients, which he had not touched. "You've already combined the base ingredients, so now grate a pear to add to the sauce," says the robotic female voice.
The would-be chef awkwardly smiles, taps the glasses to reset, and asks, "What do I do first?" again. The crowd begins to laugh. "Sorry, I think the Wi-Fi might be messed up," he says. "Back to you, Mark." Zuckerberg also blames the Wi-Fi, but keeps a cool head for the most part. The second time the tech flubs, though, he is visibly upset. This time, Zuckerberg performs the demo himself, showcasing the new neural band technology, which he calls a "huge scientific leap." The glasses pair with a wristband, Meta's innovation to replace a keyboard and mouse for smart glasses, which interprets small muscle movements in the wrist to control the glasses. A main use for the neural band is sending text messages, which Zuckerberg calls "one of the most important and frequent things we do on our phones." He pulls up a text thread on his glasses, which the crowd can see, and shows how he can "type" by writing the letters on a table with his hand. The glasses interpret his hand motions and input the text in the chat box. Things go awry when Zuckerberg attempts to answer a call on his glasses with the neural band by tapping his fingers in a quick motion. He has Meta CTO Andrew "Boz" Bosworth call him on WhatsApp. The call comes in on the glasses, but Zuckerberg cannot successfully answer it with his fingers. "That's too bad. I don't know what happened," he says after the first attempt. "Umm... maybe Boz can try calling me again." He gets a second call from Boz. "Alright, I'm going to pick this up with my neural band," Zuckerberg says, frantically moving his fingers in a last-ditch effort, but it fails again. "It happens," he acknowledges. Boz tries calling four times, and Zuckerberg concludes it is his fault: "I keep messing this up." Boz then calls for a fifth and final time, and Zuckerberg still cannot answer the call with the neural band. Defeated, he concludes, "I don't know what to tell you guys. We're just going to go to the next thing, and I hope that will work."
Luckily, the next time Zuckerberg tries to use the neural band, quickly tapping his fingers together to pull up a song on Spotify, it works. Neither Meta nor Zuckerberg has commented on the demos post-event, unlike earlier this month, when Zuckerberg acknowledged another awkward moment after he was caught on a hot mic at the White House.
[6]
Meta CTO explains the cause of its embarrassing smart glasses demo failures
When Mark Zuckerberg announced Meta's latest smart glasses at the company's Connect 2025 keynote, he encountered two glitches that prevented him from properly demonstrating some of the devices' features. Now, Meta's Chief Technology Officer, Andrew Bosworth, said in an AMA on Instagram that they were demo failures and not actual product failures. The first glitch took place in the middle of a live demo with a cooking content creator, who asked Live AI for instructions on how to make a Korean-inspired steak sauce on his Meta glasses. Instead of giving him detailed instructions, his glasses' AI skipped ahead by several steps and continued glitching. The chef told Zuckerberg that the "WiFi might be messed up" in the venue. Bosworth said, however, that it was not the case. Apparently, when the chef said "Hey Meta, start Live AI," it fired up every single Meta Ray-Ban's Live AI in the building. And since the event was all about the company's smart glasses, there were a lot of them in the venue at the time. The company had also routed Live AI's traffic to its dev server to isolate it, but it ended up routing the Live AI traffic of everyone's glasses in the building to its server. "We DDoS'd ourselves, basically," he said. He continued that it didn't happen at rehearsal, because there weren't as many people wearing the glasses when they tested it out. Zuckerberg also ran into an issue when he tried demonstrating taking WhatsApp video calls on the Meta Ray-Ban Display. The audience could see him getting calls on the glasses' HUD, but he couldn't answer them to start the call. Bosworth said that it was caused by a "never-before-seen bug" that had put the display to sleep at the very instant that the notifications came in that someone was calling. Even after Zuckerberg woke up the display, there was no option to answer the call. The CTO said Meta had never come across that bug before the demo and that it has since been fixed. "You guys know we can do video calling... 
we got WhatsApp, we know how to do video calling," he said, but admitted that it was a missed opportunity to show on stage that the feature actually works.
[7]
Meta's smart glasses demo goes off script in cringe-worthy tech fail, and here's why
If you were watching Meta's Connect conference this week, you saw some live tech demos go spectacularly off the rails. The kind where you half-watch from behind your fingers. The grand unveiling of its new AI-powered smart glasses was supposed to be a mic-drop moment. Instead, what actually happened became an instant headline for all the wrong reasons, with presenters repeatedly asking the glasses for help, only to be met with dead air and a whole lot of nothing. The first hiccup came during a cooking demo. Influencer Jack Mancuso tried to follow step-by-step guidance from the glasses to whip up a Korean-inspired steak sauce. Instead, the AI assistant went haywire and gave instructions out of order, creating a confusing mess rather than culinary magic. The immediate, easy guess? Spotty Wi-Fi. It's the classic scapegoat for every failed live demo since the dawn of the internet. But Meta's Chief Technology Officer, Andrew Bosworth, isn't going with the easy answer. In a candid series of explanations on Instagram, he detailed the actual, far more interesting cascade of failures that led to the on-stage silence (via Engadget).
When smart glasses get too smart for their own good
Behind the scenes, the glitch stemmed from what appeared to be a resource overload. Activating the Live AI feature didn't just start a session for one device; it triggered every pair of paired Ray-Ban Meta glasses in the building, overwhelming the system. As if that wasn't enough, a rare bug caused the heads-up display to go dark during a follow-up live video call demo. When Zuckerberg tried to connect with Bosworth, the call didn't show up, leaving him visibly frustrated and apologetic in front of the crowd. Bosworth described the incident as a kind of self-inflicted DDoS, joking that the company had essentially jammed its own network with too many requests at once.
He noted the rare video bug that put the display to sleep mid-call is now fixed, but the fiasco underscored just how challenging real-world AI-powered AR can be. Despite the demo hiccups, Bosworth stressed that the smart glasses themselves are fully functional. The demo glitches were just backend hiccups: the kind of complex, early-stage tech integration problems that happen when you're building something new. It's a blunt reminder: even with massive resources, developing AI-powered AR is hard, and unexpected bugs are part of the process.
[8]
Meta fail: Here's why Mark Zuckerberg's Ray-Ban Display demo didn't work
As everyone from Tim Cook to Bill Gates will attest, live demos of new features and products at a launch event are fraught with potential pitfalls. Those little goblins made themselves known at Meta Connect, when Mark Zuckerberg tried two different features of the new Meta Ray-Ban Display smart glasses, only to have both fail. While Zuckerberg blamed the issue at the time on the building's Wi-Fi, the reason was something else entirely. Andrew Bosworth, Meta's chief technology officer, took to an Instagram Story to explain what actually happened. The first problem occurred when Zuckerberg had a chef use Meta AI and the Ray-Ban Display's camera to look at the ingredients and suggest a recipe. Meta AI first failed to respond, and then jumped around, seemingly ignoring the chef's commands. "When the chef said 'hey Meta, start Live AI,' it started every single Meta Ray-Ban's Live AI in the building, and there was a lot of people in the building," Bosworth said in the Instagram AMA. "Obviously, in rehearsal, we didn't have as many people in the building." "The second thing is, we had routed Live AI traffic to our dev server in theory to isolate it, but we had done it for everyone in that building on those access points, which included all of those headsets. We DDoS'd ourselves, basically." "And, it didn't happen in rehearsal because we hadn't had as many people with the glasses in the building." The second error came when Zuckerberg tried to take a WhatsApp video call from Bosworth. Despite Bosworth repeatedly calling Zuckerberg on the Ray-Ban Displays -- the incoming call tone was clearly heard by everyone in attendance -- Zuckerberg could not answer the call. "The video call issue was quite a bit more obscure," Bosworth said in the AMA. "A never-before-seen bug. The display had gone to sleep at the very instant the notification had come in that a call was coming in. Even when Mark woke the display back up, we didn't show the answer notification to him. We never ran into that bug before. That was the first time we'd ever seen it; it's fixed now, and that's a terrible place for that bug to show up." To his credit, Zuckerberg was able to quickly pivot and make light of both errors, but it serves as a reminder that when you're trying something new on stage for the first time, anything can happen.
[9]
'We DDoS'd ourselves': Meta explains why its smart glasses demo failed so spectacularly - and says 'never-before-seen' video bug is now fixed
Meta Connect 2025 was packed with impressive smart glasses reveals, but also several iconic live demo fails. CTO Andrew Bosworth has since taken to Instagram to reveal what went wrong. The first demo issue involved a chef trying and failing to get the glasses to help him prepare a meal to showcase Live AI - an always-on version of Meta's AI that can give you continual contextual assistance by following your actions. After a few attempts to go to the next step of the process, the chef had to give up and Meta CEO Mark Zuckerberg had to carry on with the show, and put the blame on the Wi-Fi. Andrew Bosworth explained later that the Wi-Fi wasn't at fault, "When the chef said 'Hey Meta start Live AI' it started every single Ray-Ban Meta's Live AI in the building, and there was a lot of people in that building." He added that what made matters worse was "we had routed Live AI traffic to our dev server, in theory to isolate it, but we had done it for everyone in that building on those access points." Basically "We DDoS'd ourselves." A DDoS is a cyberattack that aims to overload a server by throwing as much traffic at it as possible from multiple sources so that it can't be accessed. Bosworth said that they had rehearsed the demo and it went smoothly, but that there were nowhere near as many Ray-Ban devices in the building to cause issues. What about the video call glitch that ruined Bosworth's intro to the show? This was "quite a bit more obscure" according to his post-show analysis. In what he described as "A never-before-seen bug in a new product", Bosworth said the cause of the issue was "The display had gone to sleep at the very instant the notification had come in that a call was coming." "And so it was a race condition which caused it that even when Mark woke the display back up we didn't show the answer notification to him." A race condition is a programming term for when multiple processes are being executed at the same time that rely on shared data. 
The processes aren't coordinated with each other, so they end up in a race to see which completes first - and whichever one loses can find the shared data changed underneath it, messing up whatever it was trying to do. In the demo, it sounds like the incoming-call notification and the sleep/wake feature were both trying to do different things with the display, causing the onstage gaffe. Regardless of what went wrong, the demo was "the first time we had ever seen" the bug, according to Bosworth, who added (while smiling) "it's fixed now." In other Instagram Stories, Andrew Bosworth said that while the demo failures were a bummer, they haven't convinced Meta to abandon live demos or caused much embarrassment. That's because lots of journalists - like our own Lance Ulanoff and Josephine Watson - have tried the glasses out and been very impressed with what they've seen. These "critics", as Bosworth refers to them, wouldn't be so positive if there wasn't positive stuff to say about the glasses. Speaking of which, Lance said of the Meta Ray-Ban Display glasses: "Based on my experience, nothing comes closer to effortlessly delivering information at a glance, and I'm starting to wonder if this is a glimpse of what will someday replace smartphones."
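The general race-condition definition above boils down to a textbook "lost update." The sketch below (illustrative only, unrelated to Meta's actual code) interleaves the steps of two tasks by hand so the loss happens every time; in real concurrent code the loss is intermittent, which is exactly why such bugs tend to surface for the first time at the worst possible moment.

```python
# Deterministic replay of a lost-update race: two tasks each read, modify,
# and write back the same shared value, and the interleaving decides the
# outcome.

counter = 0           # shared data both tasks rely on

a_read = counter      # Task A reads 0
b_read = counter      # Task B reads 0, before A has written back
counter = a_read + 1  # Task A writes 1
counter = b_read + 1  # Task B also writes 1 -- A's increment is lost

print(counter)        # -> 1, even though two increments ran
```

The standard fix is to make the read-modify-write step atomic (a lock, an atomic operation, or a single-owner queue), so the two tasks can no longer interleave mid-update.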
[10]
Meta's Disastrous Smart Glasses Demo Even Worse Than We Thought
Mark Zuckerberg's Meta Connect 2025 keynote on Wednesday quickly turned into a humiliating experience. The company's demos of its new artificial intelligence-powered smart glasses failed repeatedly, causing Zuckerberg to stammer his way through awkward silences. "This is, uh... it happens," the CEO said after his smart glasses refused to accept a WhatsApp video call on stage. "Let's try it again, I keep messing this up." Another demo involved content creator and amateur chef Jack Mancuso trying to get assistance from his AI glasses while cooking up a steak sauce. But the segment devolved into confusion as the "Live AI" feature assumed he was much further along in the process than he actually was, the kind of hallucination you'd expect from an AI assistant. "You already combined the base ingredients," the AI told Mancuso, who was sheepishly standing in front of an empty glass bowl. It was an embarrassing display, highlighting some glaring shortcomings with the company's efforts to infuse its Ray-Ban smart glasses with a heavy dose of AI. Afterward, in an ask-me-anything on Instagram, Meta's CTO Andrew Bosworth explained what went wrong, insisting that it was a "demo fail, not a product fail." "When the chef said, 'Hey Meta, start Live AI,' it started every single Ray-Ban Meta's Live AI in the building," he said. "And there was a lot of people in that building." "That obviously didn't happen in rehearsal," Bosworth said. "We didn't have as many things." It wasn't the only major blunder Meta encountered during its keynote. Since Meta routed all traffic to its "dev server," including from all of the headsets in the building, "we DDoS'd ourselves, basically," Bosworth admitted, referring to a common cyberattack strategy known as a "denial-of-service attack" that attempts to bring a network down by overwhelming it with phony internet traffic.
Bosworth also attempted to explain why Zuckerberg's attempt to make a WhatsApp video call using the Meta Ray-Ban Display smart glasses completely failed. Apparently, a "never-before-seen bug" left Zuckerberg unable to accept calls on his smart glasses. "You guys know we can do video calling," Bosworth pleaded on Instagram. "We got WhatsApp, we know how to do video calling." But should we really take the CTO's excuses at face value? Besides coping with the seemingly inevitable software bugs, there's a decent chance users of Meta's smart glasses will also run into a plethora of hallucinations -- a reality numerous AI gadget manufacturers have faced already. In other words, instead of being lied to by an AI on a desktop computer or smartphone, Meta is opening up the possibility of having a robotic voice mislead you straight through your smart glasses as well. Whether that kind of potential frustration is worth $379 for the regular smart glasses -- and $799 for a version of the Ray-Bans with a small screen that wearers can see in their vision -- remains to be seen. On the other hand, journalists who got the opportunity to try the glasses out for themselves appear to have been surprisingly impressed by the experience. So maybe Zuckerberg's disastrous keynote was just the result of poor planning after all.
[11]
Meta's Ray-Ban Display AI glasses stumble in onstage debut
A company doesn't get many "proof of concept" moments on a stage built for "legendary status." Meta just used up two of them in minutes. At Meta Connect this week, CEO Mark Zuckerberg tried to show the future of computing: his long-held (and somewhat maligned) idea that you won't look down at a phone so much as glance up through an invisible interface -- and the future glitched out in real time. First, a cooking assistant in the latest Ray-Ban glasses got confused in the middle of a recipe; then a marquee WhatsApp video call from Zuckerberg to chief technology officer Andrew Bosworth refused to surface on the new in-lens display. "Uh oh," a clearly uncomfortable Zuckerberg said on stage. "That's too bad. I don't know what happened." The company blamed the conference's Wi-Fi. The internet blamed Meta. Bosworth, for his part, delivered what was perhaps the most honest line of Meta Connect: The company "missed an opportunity for legendary status." The stumble stung because the pitch is audacious and, on paper, coherent. The Ray-Ban Meta Display glasses -- starting at $799 and shipping in the U.S. on Sept. 30 -- theoretically put a high-resolution panel in one lens and pair it with a neural wristband that reads the tiny electrical signals in your hand to control what you see. The demo reel is seductive: live subtitles; text and call notifications floating where people are looking; glanceable directions; translations on the fly. You can squint and see the outlines of an everyday habit, not just a gamer or cosplay headset. Well, that's the idea, anyway. When the call the company planned as the "aha" moment doesn't appear on the heads-up display (HUD), the inevitability narrative turns into a Rorschach test. Fans see a fixable glitch; skeptics see an expensive question mark. The Connect demonstration was a "comedy of errors," one person wrote on X.
Mashable wrote on X that the AI glasses failures were "SO awkward" and encouraged people to watch Zuckerberg's "painful live demo." Another social media user called the demo-day downfall "abysmal" and said the whole thing felt "rushed and scattered." A Redditor in the r/OculusQuest thread called the on-stage failure "solid comedy gold," while another in r/news said this "absolute disaster" was "possibly the worst technology demonstration I've ever seen." Meta insists the products work -- the bugs were "demo-gods" specials, the company said. Bosworth, in a series of Instagram stories, later offered specifics: The cooking bit triggered a DDoS-like overload -- "We DDoSed ourselves," he said -- when the assistant inadvertently woke many attendees' glasses at once; later, a rare sleep-state bug sent the display to nap precisely when the call arrived. "I know the product works, I know it has the goods," Bosworth said on social media. "It really was a demo fail and not a product failure." Engineers reportedly killed the hiccups after the fact, he said, but the meme had already minted itself. There's a narrow window when a company can convince people that a new device belongs in their lives. Blowing the shot on a global stage doesn't close the window -- but it sure fogs it up. Wall Street has been here with Meta before. "Trust us, this will be a thing" was also the argument for the Metaverse. Reality Labs has drained the company of staggering sums -- $17.7 billion in operating losses in 2024, after $16.1 billion in 2023 -- without producing any sort of breakout mass-market habit. And yet -- because with Meta there's often an "and yet" -- the glasses business isn't vapor. Meta owns the biggest chunk of the smart glasses market. The non-display Ray-Ban line is a bona fide success by smart-accessory standards: more than three-and-a-half million pairs since late 2023, with H1 2025 sales up more than 200% at Ray-Ban owner EssilorLuxottica. 
Those aren't iPhone numbers, but that does represent traction, the kind you can build from if the experience hardens into a habit. If Display's panel and the neural wristband make messaging, recording, and translation feel instant rather than fussy, the category has a shot at going from "toy" to "tool." And "everyday inevitability" is the path Meta keeps pointing at: not a VR escape pod, but something so entirely and completely boringly useful that you forget it's there. The hard part is that "boringly useful" demands... boring reliability. A demo is supposed to be something close to rigged -- scripted to show the device at its best, with all the duct tape and people running around with headsets hidden backstage. If Meta can't stage-manage reliability under the best-case, written-well-in-advance conditions, it might be fair for audiences to assume everyday life will be worse. And that's why the video-call connecting failure and the chef bit (standing over an empty bowl, with the AI telling him that he had already combined the base ingredients) could metastasize into something like: If this is the future, why does it trip over the present tense? Meta also faces an optics tax. The company is trying to sell a product in a world that still remembers the "glassholes," a not-so-flattering nickname people gave early Google Glass adopters (around 2013-14) who would speak loudly to their $1,500 frames. Meta has more privacy tooling now, but the trust gap is still wide. The Ray-Ban Display models have a forward-facing camera and, now, a heads-up panel. Critics worry that this makes it even harder for bystanders to know when they're being recorded or analyzed. The company added LED indicators to signal when the camera is on, but watchdogs have noted that tiny lights don't resolve the core concern: that people don't want to feel (even more) surveilled in daily life. 
One New York woman posted on TikTok recently that her waxer was wearing the Meta glasses, although the beautician reassured her client that "they're not charged, they're not on, like, I promise." The experience, the woman on TikTok said, has been "haunting her ever since." For Meta, the baggage is heavy: Its history of data-harvesting scandals leaves skeptics less likely to believe any corporate assurances about "responsible AI." For all the talk of AI and infrastructure, first impressions are still visual. If the device announces itself as experimental hardware, people will recoil. If it can pass as something ordinary -- sunglasses, not science fiction -- the glasses can earn a chance to be worn in public. The Ray-Ban styling is that clever form factor hack -- it looks like eyewear, not a dev kit. The most generous read on this week is that Meta chose to do the risky thing live, and the risk showed up in 60-point font on the screen. The less generous read is that Meta, a decade after buying Oculus, still hasn't learned how much credibility it burns every time the future asks for just one more -- We promise! -- do-over. Meta's answer here seems to be to go faster: lean into capex, reroute the model road map, and deploy fresh surfaces where AI can live as a constant companion. In that ever-optimistic worldview, Display isn't just a gadget; it's a way for the company to make the AI people are paying for visible, valuable, and daily. And... Display is a way to diversify where ad engines could eventually show up -- on faces, not just phones. Meta has pitched the glasses in the same breath as it talks nonstop about "superintelligence." A company probably doesn't say that part out loud unless it intends to be graded at least a little bit like a platform owner. The irony with the disastrous showcase is that the closest thing to a bull case here might just be hidden in the flop. 
Live-demo disasters are, historically, survivable when the underlying product is ready -- or, ready enough. Apple had the iPhone radio stack juggling act; Microsoft has had its fair share of onstage blue screens of death. What wins out, eventually (and for the most part), is the mundane -- the feeling that you can trust the thing... when you need it. For better or worse, Meta's glasses are closer to that threshold than the failed-demo jokes suggest. The industrial design is credible. The software, outside the glare of a keynote stream, increasingly is, too. The question is whether Meta, a company with a multibillionaire's tolerance for embarrassment and a sovereign fund's appetite for capex, can focus long enough on the dull work of making a feature never fail, rather than inventing three shiny distractions. So sure, the Wi-Fi was allegedly brutal. The clips were brutal, too. But the more useful frame is the credibility meter. Right now, Meta is asking consumers to accept a computer on their face and investors to underwrite a spend that would have seemed like satire 20 -- or maybe even 10 -- years ago. The first ask demands that trust be built in tiny, uneventful wins. The second ask demands proof that the money machine on Facebook and Instagram can subsidize that trust long enough to flip a culture; for now, it is. Meta has the cash and the stubbornness for both. What it doesn't have -- again: yet -- is a moment where the glasses disappear and the experience doesn't. Until it does, the company's biggest swings will keep landing as punchlines. Meta Connect promised the future of AI glasses. What connected most were the WiFi jokes. The fix here isn't another sizzle reel or expensively made, glossy, high-resolution commercial. The fix is a dull one: Zuckerberg says, "Let's try that call again," and, this time, the heads-up display just appears -- every single time. Then, the company can claim its product is something kind of close to "legendary." 
Until then, Meta's future will keep buffering.
[12]
Mark Zuckerberg laughs after the live demo of his new $800 smart glasses goes horribly wrong | Fortune
But Wednesday's grand unveiling was overshadowed by a live on‑stage demo that repeatedly failed, culminating in Mark Zuckerberg being unable to answer a video call via the new neural wristband while assuring the audience "it's all good" amid Wi‑Fi excuses. Meta is pushing a tiered family of AI glasses that range from camera‑ and audio‑first models to a new pair with an integrated transparent display, all centered on hands‑free capture, Meta AI assistance, and voice or wristband control. The Ray‑Ban Display adds a see‑through lens readout and relies on a neural wristband for subtle gesture control, marking Meta's first consumer smart glasses with a built‑in display. Meta AI powers voice queries, hands‑free photos and video, real‑time translation, and context‑aware assistance across the lineup, with the Display model extending glanceable interactions into the lens itself. A neural wristband enables subtle finger gestures for control on the Display glasses, and Meta also highlighted "conversation focus" audio processing to better hear voices in live environments. Early hands‑on coverage has been notably upbeat for the Display glasses, with one reviewer from The Verge calling them the best smart glasses tried to date and another saying they "feel like the future," while also noting they're the product to beat for the category. Broader coverage praised Gen 2's practical upgrades and battery gains, but also flagged the high‑profile live AI demo faltered on stage, tempering the otherwise strong showing. During a cooking segment, the glasses' live AI misinterpreted prompts, insisted base ingredients were already combined, and suggested steps for a sauce that hadn't been started before the host punted back to Zuckerberg citing Wi‑Fi issues, prompting his "it's all good" reassurance to a laughing crowd. "The irony of all this whole thing is that you spend years making technology and then the Wi-Fi on the day kinda... catches you," Zuckerberg said, laughing. 
"We'll go check out what he made later." Later, while wearing Ray‑Ban Meta glasses and the neural wristband, Zuckerberg repeatedly failed to answer an incoming video call on stage despite multiple attempts, eventually giving up as the ringtone continued, with other outlets noting similar struggles during the event. Fortune has reported on Meta's broader smart‑device roadmap, including a "Hypernova" pair of smart glasses expected to use a wristband controller akin to the company's ambitious Orion AR project, underscoring Meta's long‑term bet on neural wrist interfaces across its wearables. That wrist‑first interaction model mirrors the neural-band approach Meta just showcased for the Ray‑Ban Display, suggesting strategic consistency between near‑term products and pipeline devices.
[13]
Mark Zuckerberg's smart glasses demo goes wrong
A launch event for Meta's new artificial intelligence (AI) glasses left Mark Zuckerberg red-faced on Wednesday as glitches disrupted his on-stage demonstration. Attempts to show off hands-free calling through the $800 (£586) Meta Ray-Ban Display glasses were aborted after the billionaire Facebook founder failed to take a video call three times. In another segment, Jack Mancuso, a chef and influencer, asked his AI glasses to help with a recipe for a Korean steak sauce. However, the AI chatbot built into the smart glasses failed to answer his questions. Following the mishap, which took place at a launch event in California, Mr Zuckerberg told the audience: "I don't know what to tell you guys. We will go to the next thing and hope that will work. "The irony of the whole thing is you spend years making technology and the Wi-Fi of the day kind of catches you [out]."
[14]
'It's all good, it's all good' says Mark Zuckerberg as his catastrophic live demo of Meta's new smart glasses goes horribly wrong: 'You spend years making technology and then the Wi-Fi on the day catches you'
Never work with children or animals, so goes the famous advice for stage performers, entertainers, and anyone having to present something live in front of an audience. Perhaps smart glasses should be added to that list, if yesterday's Meta Connect 2025 livestream is anything to go by -- as the new Ray-Ban Meta glasses took the opportunity to misbehave at almost every turn. Unveiled to great fanfare and a rapt audience by Meta CEO Mark Zuckerberg, the second-generation smart glasses were promised to be capable of "empowering people with new abilities" and said to allow users to "make themselves smarter" thanks to the newly-polished AI functionality. Cool stuff, but when it came to the demos, things didn't exactly go to plan. Switching over to a livestreamed demonstration, displayed to the audience on a gigantic panel at Zuckerberg's side, chef Jack Mancuso attempted to use the Live AI functionality of the smart specs to help him make a Korean-inspired steak sauce. Standing in front of multiple unprepared ingredients (which the glasses appeared to initially recognise) the Meta specs immediately ignored a prompt to help make the sauce, instead listing the ingredients that might go in it. Interrupting, Mancuso asked the AI a reasonable question: "What do I do first?" After a long silence, in which metaphorical pins could be heard dropping in the audience, Mancuso asked again. The AI then merrily informed our now visibly-nervous host that the base of the sauce was already made. Another long pause. "What do I do first?" Mancuso asked once more, to laughter from the audience. "You've already combined the base ingredients," the AI continued, helpfully telling Mancuso to grate a pear into the non-existent sauce. "Alright, I think the Wi-Fi might be messed up" said Mancuso, looking embarrassed. "Back to you Mark." "It's all good, it's all good" said Zuckerberg, amid cheers, laughter, and applause from the crowd. 
"The irony of all this whole thing is that you spend years making technology and then the Wi-Fi on the day kinda... catches you. We'll go check out what he made later." Still, more demos were yet to come. Later in the presentation, Zuckerberg donned a pair of Ray-Ban Meta glasses himself, along with a wristband interface said to be able to control the glasses through muscle movements. "Now, I want to get into this in more detail, we've got two options," said Zuckerberg, laughing nervously. "We've got the slides, or we've got the live demo." At this point, the audience erupted into shouts for the live version, unsurprisingly, along with more laughter. Guess how it went. Zuckerberg was able to respond to a video call request with text via hand movements, which is fairly impressive, but was unable to answer the call itself. "Uh-oh," said the Meta CEO, frantically rubbing his fingers together in an attempt to pick up the line. "Well, I... let's see what happened there. That's too bad. I don't know what happened. Maybe Boz can try calling me again." Nope. Despite multiple attempts, Zuckerberg was left standing on stage twiddling his fingers, as the Meta AI voice digitally crunched to tell him yet another call was incoming that he seemed unable to answer. At one point, Zuckerberg blamed himself for the inability to control the device, but the ringtone continued to play across a deathly-silent hall, despite his best efforts. Eventually, the Meta head honcho gave up. "I don't know what to tell you guys," he said, eventually resorting to bringing the now much-awaited Boz onstage, amid a seemingly ever-present ringtone and much tittering from the crowd. I'll be honest, it's a pretty painful watch. As tempting as it is to make fun of Meta's multi-billionaire CEO for the borked demo, those of us who have had to present live ourselves will have our head in our hands, as I have while writing this article. 
It seems the Ray-Ban Meta glasses could do with some work, and while the tech looks very impressive on paper, the demo appears to have revealed some serious flaws in the implementation. I can't imagine many will be rushing to order a pair after this particular demonstration, but I can't help but think of another old showbiz cliché: There's no such thing as bad PR. We're all talking about it at least, and that's the main thing, eh Zuck?
[15]
Mark Zuckerberg Humiliated as AI Glasses Debut Fails in Front of Huge Crowd
On Wednesday, Meta CEO Mark Zuckerberg unveiled a slew of new augmented reality glasses, including what he claimed to be the "first AI glasses with high resolution," a new $799 version of its Meta Ray-Ban smart glasses that features a tiny screen that's viewable to the wearer. But it didn't take long for the company's MetaConnect 2025 keynote to descend into chaos. The social media giant's demos repeatedly failed, leading to awkward stares, deafening silences, and muted laughter. The poor showing painfully demonstrates that the tech is far from ready, even as companies continue to shove AI into every aspect of our daily lives. The stakes are high. Meta is spending tens of billions of dollars to build out infrastructure and hire industry-leading staff to support AI. Zuckerberg has also repeatedly doubled down on smart glasses being the future of the company, as well as AI-powered "superintelligence" as a whole. "This is one of those special moments where we get to show you something we've poured our lives into," he told the crowd at this week's event. Yet getting the tech to work on stage in front of a huge crowd proved too much, demonstrating once again that there's still a glaring gap between the AI industry's breathless promises and cold, hard reality. According to Zuckerberg's vision of the near future, wearers of Meta's glasses can converse with an AI chatbot to tell them what they're looking at -- or how to do things, like coaching on how to cook a dish. "Let's try it! It's not something I've made before," food content creator Jack Mancuso told Zuckerberg enthusiastically, after the CEO challenged him to make a steak sauce with the help of a new feature called "Live AI." "Can you help me create a Korean-inspired steak sauce?" Mancuso asked his glasses. "What do I do first?" Mancuso interjected after the robotic voice started making suggestions. "What do I do first?" the influencer repeated after several seconds of total silence that followed. 
"You already combined the base ingredients," the AI told Mancuso, who was standing in front of an empty glass bowl that he hadn't touched yet. A separate attempt by Zuckerberg to make a video call with his glasses ended with him awkwardly trying to explain why it wasn't working. Is this really all Meta has to show for it at this point? If so, the company still has an immense amount to to show if it wants to justify its enormous spending spree.
[16]
Meta Ray-Ban Display glasses glitch twice during live demo
"We're just gonna go to the next thing that we wanted to show and hope that will work," CEO Mark Zuckerberg said after a glitch cut off Meta's big demo, encompassing some stark moments of tension during the tech giant's event. In a kickoff of the company's two-day Meta Connect event at the company's headquarters on Wednesday, Zuckerberg shared the latest updates to Meta's AI glasses. But during his keynote address, Meta's AI glasses glitched not once -- but twice. The first time was when Zuckerberg was introducing an improved version of Live AI. He said users can now use the feature "for about an hour or two straight." To show how this works, Zuckerberg introduced cooking content creator Jack Mancuso to do a live demo. Mancuso asked Live AI to help him make a Korean-inspired steak sauce. The tech began to glitch after he asked what step he should do first, with the AI glasses jumping ahead in the recipe by several steps. After Mancuso repeated his question, the AI glitched again. After laughs from the crowd, Mancuso said that the "WiFi might be messed up" before throwing it back to Zuckerberg. The second glasses-related glitch happened while Zuckerberg was showing off the Meta Ray-Ban Display to the crowd. The new AI glasses use a Meta Neural Band, which Zuckerberg said replaces the "keyboard, mouse, touchscreen, buttons, dials" with the "ability to send signals from your brain with little muscle movements" that the band would pick up so users can "silently control" the glasses with "barely perceptible movements." Zuckerberg emphasized that the new glasses aren't "a prototype"; they're "ready to go." He said people will be able to buy the glasses, which start at $799, "in a couple weeks." As he continued to amp up the crowd about Meta's AI glasses, Zuckerberg presented two options: slides or a live demo. "Let's do it live," Zuckerberg said. He kicked the demo off by showing the room how the glasses can send and receive messages. 
Zuckerberg successfully received a message from Meta chief technology officer Andrew Bosworth, and Zuckerberg sent a message back in response. But the demo took a turn when Bosworth attempted to video call Zuckerberg -- to no avail. "I promise you, no one is more upset about this than I am," Bosworth said. "This is my team that now has to go debug why this didn't work on the stage." Meta Connect promised the future of AI glasses. Instead, it offered a seemingly bad internet connection.
[17]
Meta explains how its 'next big thing' fell flat on its face in front of everyone
TL;DR: MetaConnect 2025 faced multiple live demo failures, including Zuckerberg's AI glasses call and Meta AI's cooking assistance. Meta CTO Andrew Bosworth clarified these were demo issues, caused by server overload and an unprecedented bug, not product flaws, highlighting challenges in showcasing cutting-edge AI technology live. The MetaConnect 2025 keynote on Wednesday gained quite a lot of attention for all the wrong reasons, as multiple demos failed in front of live audiences. Now, Meta's Chief Technology Officer, Andrew Bosworth, has explained what happened behind the scenes to cause the failures. For those who haven't seen the video circulating online, Meta CEO Mark Zuckerberg attempted to make a WhatsApp video call with Meta's next-generation AI glasses on stage, but was unable to due to a bug. "This, uh... it happens," Zuckerberg said, adding, "Let's try it again. I keep messing this up." That wasn't the only demo that failed during the event; another involved content creator and chef Jack Mancuso, who asked Meta AI to help him make a steak sauce out of the ingredients he had in front of him. Meta AI went off script and was unable to fulfill the request. "You already combined the base ingredients," the AI told Mancuso, who was looking at an empty bowl and a table full of ingredients. Bosworth explained the failures of the demos in a recent Ask Me Anything (AMA) session on Instagram. Bosworth said it wasn't a failure of the product itself, but rather a demo failure. He said that when Mancuso said, "'Hey Meta, start Live AI,' it started every single Ray-Ban Meta's Live AI in the building," and since every single pair of Meta glasses was connected to the same developer server, the surge crashed it. Essentially, "We DDoS'd ourselves, basically," said Bosworth. 
The Meta CTO also explained the failure with Zuckerberg's demo, saying that a "never-before-seen bug" presented itself during the demo. "The display had gone to sleep at the very instant the notification had come in that a call was coming." Bosworth added, "So even when Mark woke the display back up we didn't show the answer notification to him. We have never run into that bug before."
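Bosworth's "we DDoS'd ourselves" account describes a classic thundering-herd failure: one broadcast trigger ("Hey Meta, start Live AI") heard by every device in the room, with all of those devices pointed at a single shared backend. The sketch below is a toy illustration of why rehearsal passed while the keynote failed; every number and name in it is hypothetical, not Meta's actual architecture.

```python
# Toy thundering-herd model (hypothetical values, not Meta's real stack):
# a single voice command wakes every device in earshot, and all of them
# open sessions against one shared developer server at the same instant.

REHEARSAL_DEVICES = 30    # hypothetical: only a handful of glasses at rehearsal
KEYNOTE_DEVICES = 3000    # hypothetical: a building full of attendees' glasses
SERVER_CAPACITY = 500     # hypothetical: concurrent sessions one dev server handles


def sessions_triggered(devices_in_earshot: int) -> int:
    """Every device that hears the wake phrase tries to start a Live AI session."""
    return devices_in_earshot


def server_overloaded(devices_in_earshot: int, capacity: int) -> bool:
    """The shared server fails once simultaneous session requests exceed capacity."""
    return sessions_triggered(devices_in_earshot) > capacity


# Rehearsal: load stays under capacity, so the demo works.
assert not server_overloaded(REHEARSAL_DEVICES, SERVER_CAPACITY)

# Keynote: the same command now fans out to thousands of devices -- overload.
assert server_overloaded(KEYNOTE_DEVICES, SERVER_CAPACITY)
```

The point of the sketch is that nothing about the product logic changed between rehearsal and showtime; only the fan-out did, which is why the bug never surfaced in practice runs.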
[18]
Almost Everything Went Wrong in Zuckerberg's Meta Connect Keynote. Why His Awkward Response Was a Huge Win
If you watched Mark Zuckerberg's Meta Connect keynote, you already know it didn't go according to plan. The company introduced an updated model of its Ray-Ban Meta glasses, expanded the Oakley sports line, and announced Ray-Ban Meta Display, a version that is exactly as it sounds -- Ray-Ban Meta glasses with a small display in the right lens. It comes with a neural wristband that promises to let you control devices with subtle finger movements. The event was supposed to be the moment Meta proved its vision of everyday wearable AI. Instead, almost everything went wrong. "We've got the slides, or we've got the live demo," Zuckerberg said, asking the audience which they preferred. On the one hand, you could definitely make the case that Zuckerberg should have just gone with the slides. On the other hand, I'm kind of glad he just went for it, despite the fact that at least two of the demos failed in spectacular ways. In an age where the tech keynote has mostly become a highly produced infomercial, it's refreshing to see companies attempt to demonstrate their products live on stage. Last week, for example, Apple announced its latest iPhone lineup with a 70-plus minute video. Everything about the event was perfect. The problem is that perfect often means it doesn't feel authentic. In this case, the demos were far from perfect. What mattered more, however, was how Zuckerberg responded. Sure, his reaction was awkward, but it was also an example of why leaders should take risks in public. The first failure came during a cooking demo. Meta brought in chef Jack Mancuso to show how the glasses could act as a real-time kitchen assistant. Mancuso asked for help making a Korean-inspired steak sauce. Instead of guiding him step by step, the AI stalled, repeated instructions, and got confused. Finally, Mancuso looked up and said: "All right, I think the Wi-Fi might be messed up. Sorry, back to you, Mark." "It's all good," Zuckerberg said. "You know what? It's all good. 
The irony is you spend years making technology and then the Wi-Fi at the day kind of catches you." Later, it got worse. Zuckerberg strapped on the Meta Neural Band, making a point of how he could control it with just the slightest of gestures. On stage, he tried to answer a call from Meta's CTO Andrew Bosworth. Nothing happened. He tried again. Still nothing. On the third attempt, the call failed once more. "Uh-oh. That's too bad. I don't know what to tell you guys," Zuckerberg said, before quipping: "It's really live. That's how we prove it's live." By the fourth try, he just laughed: "I don't learn. I don't learn." Eventually, he gave up, admitted the system would have to be debugged, and moved on. "You practice these things like a hundred times, and then you never know what's going to happen. No one is more upset about this than I am." For a company pitching the next era of computing, it was not a great look. For the rest of us, it was a reminder that live demos are brutally unforgiving. Of course, that's what makes this keynote important. In an era where most tech events are glossy commercials stitched together in post-production, Zuckerberg did it live. That's risky. It's also the right choice. Apple set the standard for pandemic-era events with cinematic videos that leave nothing to chance. They're beautiful and persuasive, but also kind of sterile. The company even faced criticism over its WWDC keynote last year, which showed Apple Intelligence features that still haven't shipped. If it had been doing the demos live, it probably would have made a different choice about those demos. Zuckerberg chose the opposite. He showed actual hardware in front of a live audience. The upside is obvious: credibility. Of course, the downside is exactly what happened -- when the technology fails, everyone sees it. Still, I think that's better than hiding behind video reels. More leaders should follow his example. It's one thing to promise the future. 
It's another to let people see how close -- or far -- you really are. I don't think Zuckerberg handled this flawlessly. Blaming the Wi-Fi was convenient, but unconvincing. The real issue is that these systems aren't quite ready for prime time. That doesn't mean they won't work, it just means they aren't a lock. They aren't perfectly reliable. Still, Zuckerberg didn't pretend that nothing happened. He acknowledged the failures. He kept his composure. He even laughed at himself. What he didn't do -- and what would have made the moment great -- was fully lean into the failure. Imagine if he had stopped and said: "This is exactly why we do these live. It's risky, and sometimes it doesn't work. But this is what building the future looks like." That would have turned the glitch into a strength instead of a distraction. Even so, his choice to keep going, admit frustration, and make light of it was the right move. For someone often criticized for seeming robotic, it was a rare moment of human awkwardness that actually worked in his favor. Zuckerberg took a risk by showing technology that still depends on shaky Wi-Fi, unpredictable AI outputs, and a neural wristband that sometimes misses your gesture. That risk created awkward moments, but it also made the keynote feel real. If you want to know whether a company believes in its product, watch whether it's willing to let you see it unedited. Perfection convinces people for a day. Authenticity builds trust over time.
[19]
Munster Says Meta Will 'Get Heat' For AI Glasses Demo Glitches But Praises Mark Zuckerberg For 'Doing It Live' At Connect 2025: Here's What Happened - Meta Platforms (NASDAQ:META)
On Wednesday, at Meta Platforms, Inc.'s META Connect 2025 event, Mark Zuckerberg's live demos of new AI-powered glasses stumbled twice onstage, prompting analyst Gene Munster to warn of backlash while applauding the CEO's willingness to showcase products in real time.

Live Cooking Demo Goes Off Script

Zuckerberg invited food creator Jack Mancuso onstage to demonstrate how the upgraded Ray-Ban Meta smart glasses could assist with cooking. Mancuso asked the AI for step-by-step instructions to make a Korean-inspired steak sauce. Instead of responding directly, the glasses repeated lines about soy sauce and sesame oil, skipping over the basics. Mancuso joked that Wi-Fi was to blame, handing the stage back to Zuckerberg as the audience offered encouragement. "The irony of the whole thing is that you spend years making technology and then the WiFi at the day catches you," Zuckerberg said.

Ray-Ban Display Demo Falters

The second stumble came during a demo of the Meta Ray-Ban Display, which adds a heads-up display to show notifications and navigation. Using a neural wristband, Zuckerberg tried repeatedly to answer a video call from Meta CTO Andrew Bosworth. The hand motions failed until Bosworth appeared onstage to help. "This WiFi is brutal," Bosworth said. Zuckerberg added, "You practice these things like 100 times, and then, you never know what's going to happen."

Munster Reacts To Demo 'Bombs'

Munster, managing partner at Deepwater Asset Management, weighed in on social media after the keynote. "$META will get heat for the demo bombs," he wrote. "I applaud Zuck for 'doing it live.' Bill O'Reilly must be proud."

High-Profile Demo Failures Are Not New

Meta is not alone in facing awkward onstage moments. In 2019, Tesla's Cybertruck reveal turned infamous when its "armored" windows shattered during a demo. 
Back in 2010, Steve Jobs struggled with Wi-Fi during an iPhone 4 presentation. In 2023, Google faced backlash after its Bard chatbot delivered an incorrect answer at launch. Such slip-ups can fuel skepticism but also show the challenges of integrating emerging technology into consumer-ready products.

Price Action

Meta shares slipped 0.42% to $775.72 on Wednesday but edged up 0.71% in pre-market trading on Thursday, according to Benzinga Pro.
[20]
Mark Zuckerberg has Wi-Fi glitch during live demo of Meta's new $800...
Meta CEO Mark Zuckerberg stumbled through the live debut of the company's latest line of Ray-Ban smart glasses on Wednesday, as technical glitches derailed his on-stage demo and undercut the launch of the $800 flagship model. The Facebook founder's big demo at Meta Connect 2025 went off the rails Wednesday when the $800 Ray-Ban Display glasses repeatedly misfired on stage, forcing Zuckerberg to blame "bad Wi-Fi" as the audience laughed. Zuckerberg was on stage and joined virtually by chef Jack Mancuso, who asked the glasses' AI for step-by-step help making a Korean-inspired steak sauce. Instead, the AI wrongly assumed ingredients had already been combined, jumped steps and repeated the error when he tried to restart. "The irony of all this whole thing is that you spend years making technology and then the Wi-Fi on the day kinda... catches you," Zuckerberg chuckled, telling the crowd "it's all good." It wasn't. Later, when he tried to answer a video call using a neural wristband paired with the glasses, Zuckerberg failed multiple times before giving up -- while the ringtone droned on in front of hundreds of attendees and streaming viewers. The flop overshadowed what was meant to be Meta's splashy unveiling of its three-tiered lineup: the flagship Ray-Ban Display ($799), an upgraded Ray-Ban Meta Gen 2 ($379), and a sport-oriented Oakley Meta Vanguard ($499). The devices promise hands-free photos, real-time translation, and Meta AI-powered assistance, with the Display model featuring the company's first consumer lens-integrated screen. All three go on sale Sept. 30. Despite Zuckerberg's stage struggles, early reviewers praised the Display glasses as the best of their kind, with one tech writer saying they "feel like the future." The $799 Ray-Ban Display is Meta's first pair of consumer glasses with a see-through, high-resolution display built directly into the lens -- bright enough to shine at 5,000 nits and sharp at 42 pixels per degree. 
The tiny screen can beam texts, maps, and images into your field of view, with live captions and real-time translation popping up like sci-fi subtitles. Instead of buttons or voice prompts, the glasses are controlled by a new "Neural Band" wristband that reads muscle signals from subtle finger movements. Battery life runs about six hours of mixed use, and the glasses come with Transitions® lenses so they work indoors and out.

The $379 Ray-Ban Meta Gen 2 upgrades the original model with a 12-megapixel camera capable of 3K video at 60fps, doubled battery life (up to eight hours, with 48 more in the case), and classic Ray-Ban frames in new colors like Cosmic Blue and Mystic Violet. Meta AI now "sees" through the glasses, offering real-time directions, a conversation mode for noisy environments, and on-the-spot translation.

The $499 Oakley Meta Vanguard is the sport model: rugged enough for sweat and water, with an ultra-wide 12-megapixel camera, a five-mic array, louder speakers, and quick charging that hits 50% in 20 minutes.
Meta's live demonstrations of new smart glasses technology encountered significant issues at the Meta Connect event. CTO Andrew Bosworth later explained the technical reasons behind the failures, which were not related to Wi-Fi as initially suggested.
Meta's annual Connect conference, which unveiled new smart glasses including the upgraded Ray-Ban Meta and the Meta Ray-Ban Display, faced technical difficulties during live demonstrations. Multiple demo failures visibly challenged CEO Mark Zuckerberg and his team [1][2]. Source: Futurism
Two main demos failed. Jack Mancuso's recipe-guidance demo using Ray-Ban Meta's Live AI stalled when the assistant became unresponsive [1][3]. Mark Zuckerberg also couldn't answer a WhatsApp video call from CTO Andrew Bosworth on the new smart glasses via the neural wristband controller [4][5]. Source: New York Post
CTO Andrew Bosworth later clarified the issues on Instagram, correcting the initial Wi-Fi theories [1][3]. Source: TechRadar
The public mishaps raised questions about the readiness of Meta's smart glasses [2]. Meta remains committed to its "personal superintelligence" vision through AI eyewear: the WhatsApp call bug has been fixed, product refinement continues, and the company is confident in its technology's potential [1][3][5].