
Bats Can No Longer Haunt Apple VR Headsets Via Web Exploit

26 June 2024 at 11:00

Bug reporting doesn’t usually have a lot of visuals. Not so with the visionOS bug [Ryan Pickren] found, which fills a user’s space with screeching bats after they visit a malicious website. Even better, closing the browser doesn’t get rid of them! Better still? They don’t need to be bats; they could be spiders. Fun!

The bug has been fixed, but here’s how it worked: the Safari browser build for visionOS allowed a malicious website to fill the user’s 3D space with animated objects without interaction or permission. The code to trigger this is remarkably succinct, and is actually a new twist on an old feature: Apple AR Quick Look, an HTML-based feature for rendering 3D augmented reality content in Safari.

How about spiders, instead?

Leveraging this old feature is what lets an untrusted website launch an arbitrary number of animated 3D objects — complete with sound — into a user’s virtual space without any interaction from the user whatsoever. The icing on the cake is that Quick Look is a separate process, so closing Safari doesn’t get rid of the pests.
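To get a feel for how little is needed, here is a minimal TypeScript sketch of the pattern described above, not [Ryan Pickren]’s actual proof of concept: AR Quick Look content is normally launched from an anchor tag marked rel="ar", and a script that builds such anchors and clicks them programmatically is the kind of no-interaction trigger at issue. The model URL and the object count are placeholders, and the underlying bug has since been patched.

```typescript
// Illustrative sketch only: AR Quick Look is normally triggered by a user
// tapping an <a rel="ar"> link that points at a USDZ model. A script can
// build such links and dispatch synthetic clicks on them itself.
function spawnQuickLookObject(modelUrl: string): void {
  const anchor = document.createElement("a");
  anchor.setAttribute("rel", "ar");       // marks the link as AR Quick Look content
  anchor.setAttribute("href", modelUrl);  // placeholder URL for a .usdz model

  // Quick Look expects the anchor to contain an image element.
  anchor.appendChild(document.createElement("img"));

  document.body.appendChild(anchor);
  anchor.click();                         // synthetic click: no user interaction
}

// Each call hands another animated model off to the separate Quick Look
// process, which is why closing Safari doesn't clean them up.
for (let i = 0; i < 100; i++) {
  spawnQuickLookObject("https://example.com/screeching-bat.usdz");
}
```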

Providing immersive 3D via a web browser is a valuable way to deliver interactive content on both desktops and VR headsets; a good example is the fantastic virtual BBC Micro, which uses WebXR. But on the Apple Vision Pro, the user is always involved, and there are privacy boundaries that corral such content. Content being launched into a user’s space without any interaction is certainly not intended behavior.
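For contrast, here is a rough sketch of how intentional immersive web content is supposed to enter a headset via WebXR: the session request is tied to an explicit user gesture, and the browser can still prompt for permission on top of that. The minimal interfaces below stand in for the full WebXR typings, and the "enter-vr" button id is hypothetical.

```typescript
// Minimal slice of the WebXR surface used here (full typings come from @types/webxr).
interface XRSessionLike { addEventListener(type: "end", cb: () => void): void; }
interface XRSystemLike {
  isSessionSupported(mode: string): Promise<boolean>;
  requestSession(mode: string): Promise<XRSessionLike>;
}

async function enterVR(): Promise<void> {
  const xr = (navigator as Navigator & { xr?: XRSystemLike }).xr;
  if (!xr || !(await xr.isSessionSupported("immersive-vr"))) {
    console.log("Immersive VR not available in this browser");
    return;
  }
  // The spec ties requestSession() to a user activation, e.g. a click on an
  // "Enter VR" button, and the browser may still ask the user for permission.
  const session = await xr.requestSession("immersive-vr");
  session.addEventListener("end", () => console.log("XR session ended"));
}

// Hook the request to an explicit user gesture; "enter-vr" is a hypothetical button id.
document.getElementById("enter-vr")?.addEventListener("click", () => {
  void enterVR();
});
```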

The final interesting bit about this bug (or loophole) is that, in a way, it defies easy classification and highlights a new sort of issue. While it seems obvious from a user experience and interface perspective that a random website spawning screeching crawlies into one’s personal space is not ideal, is this a denial-of-service issue? A privilege escalation that technically isn’t? It’s certainly unexpected behavior, but that doesn’t really capture the potential psychological impact such bugs can have. Perhaps the invasion of personal space and user boundaries will become a quantifiable aspect of bugs on these new platforms. What fun.

DIY Eye and Face Tracking for the Valve Index VR Headset

20 May 2024 at 05:00

The Valve Index VR headset has been around for a few years now. It doesn’t come with eye or face tracking, but that didn’t stop inspired folks like [Physics-Dude] from adding DIY solutions in elegant and effective ways using a combination of hardware, open software, and 3D printable parts.

The whole assembly integrates tightly, thanks in part to the “frunk” designed into the Index for exactly this kind of thing.

This build leverages the EyeTrackVR project (and, optionally, Project Babble for mouth tracking), both of which have great applications, particularly in social VR spaces.

These are open-source, self-contained and modular solutions intended for a variety of hardware platforms. Of course, every millimeter and gram tends to count when it’s something that gets worn on one’s head, so [Physics-Dude] tailored a solution specifically for the Valve Index. His project makes great use of the platform’s hacker-friendly hardware design.

[Physics-Dude] also makes excellent use of a certain widely-available “gumstick” style USB hub as an important part of his build. Combined with the front-mounted USB port on the Index, it results in an extremely compact and tightly integrated solution that looks great. While it can be risky to rely on a particular off-the-shelf item in a build, doing so absolutely has its place here.

The documentation is fantastic, including welcome guidance on cable routing and step-by-step instructions. If you’ve been interested in adding eye tracking to a project, be sure to give it a look. Already have eye tracking in a project of your own? Tell us all about it!

A Master-Class On Reverse-Engineering Six AR Glasses

12 May 2024 at 08:00
Two pictures of the same black dog, wearing two separate pairs of the AR glasses reviewed in these two articles

Augmented reality (AR) tech is getting more and more powerful, the glasses themselves are getting sleeker and prettier, and at some point, hackers have to conquer this frontier and extract as much as possible. [Void Computing] is writing an open source SDK for making use of AR glasses, and, along the way, they’ve brought us two wonderful blog posts filled with technical information laid out in a fun-to-read way. The first article is titled “AR glasses USB protocols: the Good, the Bad and the Ugly”, and the second one follows as “the Worse, the Better and the Prettier”.

Have you ever wanted to learn how AR glasses and similar devices work, what their internal structure looks like, and which ones are designed well (and which ones maybe not so much)? These two posts have concise explanations, plenty of diagrams, and six case studies of different pairs of AR glasses on the market, each pair demonstrated by our hacker’s canine assistant.

[Void Computing] goes in-depth on this tech — you will witness MCU firmware reverse-engineering, HID packet captures, a quick refresher on the USB-C DisplayPort altmode, hexdumps aplenty, and a reminder on often forgotten tools of the trade like Cunningham’s law.
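If you want to poke at your own hardware in the same spirit, the HID side of this kind of work can start out as simply as the sketch below. It uses the node-hid package, and the vendor/product IDs are placeholders rather than anything taken from [Void Computing]’s posts; the idea is just to enumerate the HID interfaces a pair of glasses exposes and hexdump raw input reports for later decoding.

```typescript
// Sketch of an HID "packet capture" starting point (npm install node-hid).
import HID from "node-hid";

const VENDOR_ID = 0x1234;   // placeholder: your glasses' USB vendor ID
const PRODUCT_ID = 0x5678;  // placeholder: your glasses' USB product ID

// List every HID interface the device exposes; AR glasses often expose several.
for (const info of HID.devices()) {
  if (info.vendorId === VENDOR_ID && info.productId === PRODUCT_ID) {
    console.log(info.path, info.usagePage, info.usage);
  }
}

// Open the device and hexdump incoming reports (IMU data, buttons, and so on),
// ready for protocol guessing.
const device = new HID.HID(VENDOR_ID, PRODUCT_ID);
device.on("data", (report: Buffer) => {
  console.log(report.toString("hex").match(/../g)?.join(" "));
});
```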

If reverse-engineering lights your fire, these high-level retrospectives will teach you viable ways to reverse-engineer devices in your own life, and they certainly set a high bar as far as write-ups go. Having read through these posts, one can’t help but think that some sort of AR glasses protocol standard is called for here, but fortunately, it appears that [Void Computing]’s SDK is the next best thing, and their mission to seize the good aspects of a tentative cyberpunk future is looking to be a success. We started talking about AR glasses over a decade ago, and it’s reassuring to see hackers catching up on this technology’s advancements.

We thank [adistuder] for sharing this with us on the Hackaday Discord server!

Here’s How That Disney 360° Treadmill Works

4 May 2024 at 05:00

One thing going slightly viral lately is footage of Disney’s “HoloTile” infinite floor, an experimental sort of 360° treadmill developed by [Lanny Smoot]. But how exactly does it work? Details about that are less common, but [Marques Brownlee] got first-hand experience with HoloTile and has a video all about the details.

HoloTile is a walking surface that looks like it’s made up of blueish bumps or knobs of some kind. When one walks upon the surface, it constantly works to move its occupant back to the center.

Whenever one moves, the surface works to move the user back to the center.

Each of these bumps is in fact a disk that has the ability to spin one way or another and pivot in different directions. Each disk therefore becomes a sort of tilted wheel whose edge is in contact with whatever is on its surface. By exerting fine control over each of these actuators, the control system is able to create a conveyor-belt-like effect in any arbitrary direction. This can be leveraged in several different ways, including acting as a sort of infinite virtual floor.
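As a back-of-the-envelope illustration of that control idea (our own sketch, not Disney’s actual algorithm), one could pick a surface velocity that pushes the occupant back toward the center and then command every disk under them with the matching heading and spin speed, conveyor-belt style. The gain and speed cap below are assumed values.

```typescript
// Sketch of a recentering control loop for a HoloTile-like floor of tilted-wheel disks.
interface Vec2 { x: number; y: number; }

const GAIN = 1.5;       // how aggressively to recenter (assumed value)
const MAX_SPEED = 2.0;  // cap on surface speed, in m/s (assumed value)

// Desired floor velocity given where the occupant currently stands (center is the origin).
function recenteringVelocity(occupant: Vec2): Vec2 {
  let vx = -GAIN * occupant.x;
  let vy = -GAIN * occupant.y;
  const speed = Math.hypot(vx, vy);
  if (speed > MAX_SPEED) {
    vx *= MAX_SPEED / speed;
    vy *= MAX_SPEED / speed;
  }
  return { x: vx, y: vy };
}

// Each disk gets a heading (which way its contact edge pushes) and a spin speed;
// giving every disk under the occupant the same command makes the patch of floor
// behave like a conveyor belt pointed at the center.
function diskCommand(occupant: Vec2): { headingRad: number; speed: number } {
  const v = recenteringVelocity(occupant);
  return { headingRad: Math.atan2(v.y, v.x), speed: Math.hypot(v.x, v.y) };
}

// Example: the occupant has wandered 0.8 m to the right of center.
console.log(diskCommand({ x: 0.8, y: 0.0 }));
```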

[Marques] found the system highly responsive and capable of faster movement than many would find comfortable. When walking on it, there is a feeling of one’s body moving in an unexpected direction, but that was something he found himself getting used to. He also found that it wasn’t exactly quiet, but we suppose one can’t have everything.

How this device works has a rugged sort of elegant brute force vibe to it that we find appealing. It is also quite different in principle from other motorized approaches to simulate the feeling of walking while keeping the user in one place.

The whole video is embedded just below the page break, but if you’d like to jump directly to [Marques] explaining and showing exactly how the device works, you can skip to the 2:22 mark.
