
Dog Poop Drone Cleans Up the Yard So You Don’t Have To

28 September 2024 at 23:00

Sometimes you instantly know who’s behind a project from the subject matter alone. So when we saw this “aerial dog poop removal system” show up in the tips line, we knew it had to be the work of [Caleb Olson].

If you’re unfamiliar with [Caleb]’s oeuvre, let us refresh your memory. [Caleb] has been on a bit of a dog poop journey, starting with a machine-learning system that analyzed security camera footage to detect when the adorable [Twinkie] dropped a deuce in the yard. Not content with just knowing when a poop event has occurred, he automated the task of locating the packages with a poop-pointing robot laser. Removal of the poop remained a manual task, one which [Caleb] was keen to outsource, hence the current work.

The video below, from a lightning talk at a conference, is pretty much all we have to go on, and the quality is a bit potato-esque. And while [Caleb]’s PoopCopter is clearly still a prototype, it’s easy to get the gist. Combining data from the previous poop-adjacent efforts, [Caleb] has built a quadcopter that can (or will, someday) be guided to the approximate location of the offending package, home in on it using a downward-looking camera, and autonomously whisk it away.
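No code has been published, but purely as a sketch of how that camera-guided homing step could work, here's a minimal proportional controller in Python that turns the target's pixel offset into horizontal velocity commands. The detector, the `send_velocity` interface, the gain, and the axis mapping are all assumptions for illustration, not details from [Caleb]'s build.

```python
import numpy as np

KP = 0.002  # proportional gain, m/s per pixel (illustrative value)

def center_offset(frame: np.ndarray, detect):
    """Return (dx, dy) from the image center to the detected target, or None.

    `detect` is an assumed detector returning an (x, y, w, h) box or None.
    """
    box = detect(frame)
    if box is None:
        return None
    x, y, w, h = box
    cx, cy = x + w / 2.0, y + h / 2.0
    return cx - frame.shape[1] / 2.0, cy - frame.shape[0] / 2.0

def homing_step(frame: np.ndarray, detect, send_velocity):
    """One control iteration: center over the target, then descend."""
    off = center_offset(frame, detect)
    if off is None:
        send_velocity(0.0, 0.0, 0.0)  # nothing seen: hold position, keep looking
        return
    dx, dy = off
    vx, vy = -KP * dy, KP * dx  # image axes -> body axes (assumed camera mounting)
    vz = 0.3 if abs(dx) < 20 and abs(dy) < 20 else 0.0  # descend once roughly centered (NED: +z is down)
    send_velocity(vx, vy, vz)
```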

The retrieval mechanism is the high point for us; rather than a complicated, servo-laden “sky scoop” or something similar, the drone has a bell-shaped container on its belly with a series of geared leaves on the open end. The leaves are open when the drone descends onto the payload, and then close as the drone does a quick rotation around the yaw axis. And, as [Caleb] gleefully notes, the leaves can also open in midair with a high-torque yaw move in the opposite direction; the potential for neighborly hijinx is staggering.
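The quick yaw twist that opens and closes the leaves is, on the flight-controller side, a standard MAVLink maneuver. Here's a hedged sketch using pymavlink's MAV_CMD_CONDITION_YAW command; the connection string, twist angle, and rate are illustrative guesses, not values from the PoopCopter.

```python
from pymavlink import mavutil

# Illustrative ground-station connection; the real build's link is unknown.
master = mavutil.mavlink_connection("udpin:0.0.0.0:14550")
master.wait_heartbeat()

def yaw_twist(angle_deg: float, rate_deg_s: float, clockwise: bool) -> None:
    """Command a fast yaw rotation relative to the current heading."""
    master.mav.command_long_send(
        master.target_system,
        master.target_component,
        mavutil.mavlink.MAV_CMD_CONDITION_YAW,
        0,                       # confirmation
        angle_deg,               # param1: yaw angle (deg)
        rate_deg_s,              # param2: angular speed (deg/s)
        1 if clockwise else -1,  # param3: direction (1 = CW, -1 = CCW)
        1,                       # param4: 1 = relative to current heading
        0, 0, 0,                 # params 5-7: unused
    )

yaw_twist(90, 360, clockwise=True)     # snap the leaves shut over the payload
# yaw_twist(90, 360, clockwise=False)  # the high-torque reverse twist reopens them
```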

All jokes and puns aside, this looks fantastic, and we can’t wait for more information and a better video. And lest you think [Caleb] only works on “Number Two” problems, never fear — he’s also put considerable work into automating his offspring and taking the awkwardness out of social interactions.

Mothbox Watches Bugs, So You — Or Your Grad Students — Don’t Have To

19 September 2024 at 11:00

To the extent that one has strong feelings about insects, they tend toward the extremes of a spectrum that runs from complete fascination with their diversity and the specializations they've evolved to exploit unique and ultra-narrow ecological niches, to "Eww, ick! Kill it!" It's pretty clear that [Dr. Andy Quitmeyer] and his team tend toward the former, and while they love their bugs, spending all night watching them is a tough enough gig that they came up with Mothbox, the automated insect monitor.

Insect censuses are valuable tools for assessing the state of an ecosystem, especially given insects' vast numbers, short lifespans, and proximity to the base of the food chain. Mothbox is designed to be deployed in insect-rich environments and automatically recognize and tally the moths it sees. It uses an Arducam and a Raspberry Pi for image capture, plus an array of UV and visible LEDs, all in a weatherproof enclosure. The moths are attracted to the light and fly between the camera and a plain white background, where an image is captured. YOLO v8 locates all the moths in the image, crops them out, and sends them to BioCLIP, a vision model for organismal biology that appears similar to something we've seen before. The model automatically sorts the moths by taxonomic features and keeps a running tally of which species it sees.
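For a sense of how such a detect-crop-classify loop fits together, here's a hedged Python sketch using the ultralytics YOLOv8 API and the publicly released BioCLIP weights via open_clip. The detector weights, species list, and file name are placeholders, not Mothbox's actual configuration; the real build files live on the project site.

```python
from collections import Counter

import open_clip
import torch
from PIL import Image
from ultralytics import YOLO

detector = YOLO("yolov8n.pt")  # stand-in weights; Mothbox trains its own
model, _, preprocess = open_clip.create_model_and_transforms(
    "hf-hub:imageomics/bioclip"
)
tokenizer = open_clip.get_tokenizer("hf-hub:imageomics/bioclip")

# Example candidate taxa only; a real deployment would use a regional list.
species = ["Actias luna", "Hyalophora cecropia", "Automeris io"]
with torch.no_grad():
    text_feats = model.encode_text(tokenizer(species))
    text_feats /= text_feats.norm(dim=-1, keepdim=True)

tally = Counter()
frame = Image.open("trap_frame.jpg")  # hypothetical capture from the Pi camera
for box in detector(frame)[0].boxes.xyxy:  # one bounding box per detected moth
    x1, y1, x2, y2 = map(int, box.tolist())
    crop = preprocess(frame.crop((x1, y1, x2, y2))).unsqueeze(0)
    with torch.no_grad():
        img_feat = model.encode_image(crop)
        img_feat /= img_feat.norm(dim=-1, keepdim=True)
        best = (img_feat @ text_feats.T).argmax().item()  # nearest species embedding
    tally[species[best]] += 1

print(tally)  # running count per species, as the article describes
```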

Mothbox is open source and the site has a ton of build information if you’re keen to start bug hunting, plus plenty of pictures of actual deployments, which should serve as nightmare fuel to the insectophobes out there.

Olympic Sprint Decided By 40,000 FPS Photo Finish

17 August 2024 at 20:00

[Image: 40,000 FPS Omega camera captures Olympic photo-finish]

Advanced technology played a crucial role in determining the winner of the men’s 100-meter final at the Paris 2024 Olympics. In a historically close race, American sprinter Noah Lyles narrowly edged out Jamaica’s Kishane Thompson by just five-thousandths of a second. The final decision relied on an image captured by an Omega photo finish camera that shoots an astonishing 40,000 frames per second.

This cutting-edge technology, originally reported by PetaPixel, ensured the accuracy of the result in a race where both athletes recorded a time of 9.79 seconds. If SmartThings' shot pourer from the 2012 Olympics were still around, it could once again fulfill its intended role of celebrating US medals.

Omega, the Olympics’ official timekeeper for decades, has continually innovated to enhance performance measurement. The Omega Scan ‘O’ Vision Ultimate, the camera used for this photo finish, is a significant upgrade from its 10,000 frames per second predecessor. The new system captures four times as many frames per second and offers higher resolution, providing a detailed view of the moment each runner’s torso touches the finish line. This level of detail was crucial in determining that Lyles’ torso touched the line first, securing his gold medal.
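It's worth running the numbers on what that frame rate means at the line. Assuming a finishing speed of roughly 12 m/s (a typical figure for elite sprinters, not one from the article), a quick calculation shows the five-millisecond margin spans a comfortable number of frames:

```python
# Back-of-the-envelope check of what 40,000 fps buys at the finish line.
fps_new, fps_old = 40_000, 10_000
margin_s = 0.005     # Lyles' winning margin
speed_m_s = 12.0     # assumed finishing speed, not from the article

frame_interval_us = 1e6 / fps_new
frames_in_margin = margin_s * fps_new
mm_per_frame = speed_m_s / fps_new * 1000

print(f"{frame_interval_us:.0f} us between frames")              # 25 us
print(f"{frames_in_margin:.0f} frames inside the margin")        # 200 frames
print(f"{mm_per_frame:.2f} mm of torso travel per frame")        # 0.30 mm
print(f"vs {margin_s * fps_old:.0f} frames at the old 10k rate") # 50 frames
```

Even at the older camera's 10,000 fps the margin would have been resolvable, but the fourfold increase leaves far less room for ambiguity at the sub-millimeter scale.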

This camera is part of Omega's broader technological advancements for the Paris 2024 Olympics, which include AI-powered computer vision systems with high-definition cameras that track athletes in real time. For a closer look at how technology decided this historic race, watch the video by Eurosport that captured the event.
