
An Adaptive Soundtrack for Bike Tricks

22 June 2025 at 08:00
A man is shown performing a wheelie on a red bicycle in a classroom. In the background, a projector is displaying a phone screen running an indistinct app.

If you’ve put in all the necessary practice to learn bike tricks, you’d probably like an appropriately dramatic soundtrack to accompany your stunts. A team of students working on a capstone project at the University of Washington took this natural desire a step further with the Music Bike, a system that generates adaptive music in response to the bike’s motion.

The Music Bike carries a set of sensors controlled by an ESP32-S3 mounted beneath the bike seat. The ESP32 transmits the data it collects over BLE to an Android app, which in turn uses the FMOD Studio adaptive sound engine to generate the music. An MPU9250 IMU collects most of the position and motion data, supplemented by a hall effect sensor that tracks wheel speed and direction of rotation.
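
For a sense of what firmware like this involves, here's a minimal ESP32 Arduino sketch of the general pattern, not the students' actual code: the pins, UUIDs, and payload layout are invented. It wakes an MPU9250 over I2C, counts hall-sensor pulses in an interrupt, and streams readings out as BLE notifications.

```cpp
#include <Wire.h>
#include <BLEDevice.h>
#include <BLEServer.h>
#include <BLE2902.h>

constexpr uint8_t MPU9250_ADDR = 0x68;   // default MPU9250 I2C address
constexpr uint8_t PWR_MGMT_1   = 0x6B;   // power management register
constexpr uint8_t ACCEL_XOUT_H = 0x3B;   // first of six accel data registers
constexpr int     HALL_PIN     = 4;      // hypothetical hall-sensor GPIO

volatile uint32_t wheelPulses = 0;
void IRAM_ATTR onWheelPulse() { wheelPulses++; }

BLECharacteristic *telemetry = nullptr;

void setup() {
  Wire.begin();
  // Wake the IMU out of its power-on sleep state.
  Wire.beginTransmission(MPU9250_ADDR);
  Wire.write(PWR_MGMT_1);
  Wire.write(0x00);
  Wire.endTransmission();

  pinMode(HALL_PIN, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(HALL_PIN), onWheelPulse, FALLING);

  // One service, one notify-only characteristic (UUIDs are placeholders).
  BLEDevice::init("MusicBike");
  BLEServer *server = BLEDevice::createServer();
  BLEService *service = server->createService("12345678-0000-1000-8000-00805f9b34fb");
  telemetry = service->createCharacteristic(
      "12345679-0000-1000-8000-00805f9b34fb",
      BLECharacteristic::PROPERTY_NOTIFY);
  telemetry->addDescriptor(new BLE2902());
  service->start();
  server->getAdvertising()->start();
}

void loop() {
  // Burst-read the three 16-bit accelerometer axes.
  Wire.beginTransmission(MPU9250_ADDR);
  Wire.write(ACCEL_XOUT_H);
  Wire.endTransmission(false);
  Wire.requestFrom((int)MPU9250_ADDR, 6);
  int16_t ax = (Wire.read() << 8) | Wire.read();
  int16_t ay = (Wire.read() << 8) | Wire.read();
  int16_t az = (Wire.read() << 8) | Wire.read();

  // Pack accel axes plus the wheel pulse count into one notification.
  // A toy format: a real payload would also carry gyro and direction data.
  int16_t payload[4] = {ax, ay, az, (int16_t)wheelPulses};
  telemetry->setValue((uint8_t *)payload, sizeof(payload));
  telemetry->notify();
  delay(50);  // roughly 20 Hz update rate
}
```

A companion app would subscribe to the characteristic and decode the eight-byte payload; the rest of the IMU data is left out to keep the sketch short.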

When the Android app receives sensor data, it processes it to detect the bike's actions, then uses those events to control FMOD's output. The students tried using machine learning to detect bike tricks, but had trouble with latency and accuracy, so they switched to a threshold classifier. They were eventually able to detect jumps, 180-degree spins, forward and reverse motion, and wheelies. FMOD uses this information to modify the music's pitch, alter instrument layering, and change the track. The students gave an impressive in-class demonstration of the system in the video below (the demonstration begins at 4:30).
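
A threshold classifier like the one described can be surprisingly small. Here's a toy version in plain C++; the thresholds and heuristics are invented for illustration, and the students' detector and tuning are their own.

```cpp
#include <cmath>
#include <cstdio>

struct ImuSample {
  float pitchDeg;    // nose-up angle from the IMU's orientation filter
  float accelZ_g;    // vertical acceleration in g
  float yawRateDps;  // yaw rate in degrees per second
};

enum class Trick { None, Wheelie, Jump, Spin180 };

class TrickDetector {
 public:
  Trick update(const ImuSample &s, float dt) {
    // Integrate yaw rate to catch a 180-degree spin.
    yawAccum_ += s.yawRateDps * dt;
    if (std::fabs(yawAccum_) >= 180.0f) { yawAccum_ = 0; return Trick::Spin180; }
    // Near free-fall on the vertical axis reads as a jump.
    if (s.accelZ_g < 0.3f) return Trick::Jump;
    // Sustained nose-up pitch reads as a wheelie.
    if (s.pitchDeg > 25.0f) return Trick::Wheelie;
    yawAccum_ *= 0.98f;  // leak the integrator so drift doesn't false-trigger
    return Trick::None;
  }
 private:
  float yawAccum_ = 0;
};

int main() {
  TrickDetector det;
  ImuSample wheelie{30.0f, 1.1f, 0.0f};  // nose up, normal g, no spin
  if (det.update(wheelie, 0.02f) == Trick::Wheelie)
    std::puts("wheelie detected");
}
```

The appeal over a machine-learning model is clear: a handful of comparisons runs in microseconds and fails in predictable, tunable ways.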

Surprisingly enough, this isn’t the first music-producing bike we’ve featured here. We’ve also seen a music-reactive bike lighting system.

Thanks to [Blake Hannaford] for the tip!

Honey, I Blew Up The Line Follower Robot

21 May 2025 at 20:00
[Austin Blake] sitting on line follower cart in garage

Some readers may recall building a line-following robot during their school days. Involving some IR LEDs, perhaps a bit of LEGO, and plenty of trial-and-error, it was fun on a tiny scale. Now imagine that—but rideable. That’s exactly what [Austin Blake] did, scaling up a classroom robotics staple into a full-size vehicle you can actually sit on.

The robot uses a whopping 32 IR sensors to follow a black line across a concrete workshop floor, adjusting its path using a steering motor salvaged from a power wheelchair. An Arduino Mega Pro Mini handles the logic, sending PWM signals to a DIY servo. The chassis consists of a modified Crazy Cart, selected for its absurdly tight turning radius. With each prototype iteration, [Blake] improved sensor precision and motor control, turning a bumpy ride into a smooth glide.
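
The classic line-follower algorithm scales up gracefully: average the indices of the sensors that currently see the line, then steer proportionally to the error from center. Below is a sketch of that idea with placeholder pins, sensor polarity, and gain, not [Blake]'s actual values.

```cpp
#include <Arduino.h>

constexpr int   NUM_SENSORS   = 32;
constexpr int   FIRST_PIN     = 22;  // Mega pins 22-53, one per sensor (assumed)
constexpr int   STEER_PWM_PIN = 9;
constexpr float CENTER        = (NUM_SENSORS - 1) / 2.0f;

float lastPosition = CENTER;

void setup() {
  for (int i = 0; i < NUM_SENSORS; i++) pinMode(FIRST_PIN + i, INPUT);
  pinMode(STEER_PWM_PIN, OUTPUT);
}

void loop() {
  // Average the indices of all sensors currently over the black line.
  int active = 0;
  long sum = 0;
  for (int i = 0; i < NUM_SENSORS; i++) {
    if (digitalRead(FIRST_PIN + i) == LOW) {  // LOW over the line (assumed)
      sum += i;
      active++;
    }
  }
  // Hold the last known position if the line is momentarily lost.
  float position = active ? (float)sum / active : lastPosition;
  lastPosition = position;

  // Proportional steering: map the centering error onto the servo PWM.
  float error = position - CENTER;  // negative means the line is to the left
  float kp = 8.0f;                  // gain: pure guesswork, tune on the floor
  int pwm = constrain((int)(128.0f + kp * error), 0, 255);
  analogWrite(STEER_PWM_PIN, pwm);
}
```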

The IR sensor array, which on the palm-sized vehicle consisted of just a handful of components, evolved into a PCB-backed bar nearly 0.5 meters wide. Potentiometer tuning was a fiddly affair, but worth it. Crashes? Sure. But the kind that makes you grin like your teenage self. If it looks like fun, you could either build one yourself, or upgrade a similar LEGO project.

A UV Meter For The Flipper Zero

19 May 2025 at 18:00
flipper zero uv sensor

We all know UV radiation for its contributions to getting sunburned after a long day outside, but were you aware that there are several different types of UV rays at play? [Michael] has come up with a Flipper Zero add-on board and app to measure three types of UV radiation, and he explains some of the nuances he learned about measuring UV along the way.

At the heart of this project is an AS7331 sensor, which measures the UV-A, UV-B, and UV-C radiation values that the Flipper Zero reads via I2C. While first using this chip, he realized that reading these values is more complex than just querying the right register, and by the end of the project he'd written his own AS7331 library to retrieve them. There was also some experimenting with different GUI designs for the app; the Flipper Zero's screen is only 128x64 pixels, and he had a lot of data to display. One feature we really enjoyed was the wiring guide built into the app: if you install the Flipper Zero app and have just an AS7331 sensor on hand, you'll know how to hook it up. However, if you'd rather skip the loose wiring, he has also provided design files for a PCB that plugs into the top of the Flipper Zero.
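
To illustrate the "more complex than querying the right register" part: the AS7331 powers up in a configuration state and only exposes its result registers after being switched into a measurement state. The Arduino-style sketch below shows that dance; the register addresses and bit values reflect one reading of the datasheet and should be treated as unverified placeholders (use [Michael]'s library for anything real).

```cpp
#include <Wire.h>

constexpr uint8_t AS7331_ADDR = 0x74;  // default address, A0/A1 tied low
constexpr uint8_t REG_OSR     = 0x00;  // operational state register
constexpr uint8_t REG_MRES1   = 0x02;  // UV-A result (measurement state only)
constexpr uint8_t REG_MRES2   = 0x03;  // UV-B result
constexpr uint8_t REG_MRES3   = 0x04;  // UV-C result

void writeReg(uint8_t reg, uint8_t val) {
  Wire.beginTransmission(AS7331_ADDR);
  Wire.write(reg);
  Wire.write(val);
  Wire.endTransmission();
}

uint16_t readResult(uint8_t reg) {
  Wire.beginTransmission(AS7331_ADDR);
  Wire.write(reg);
  Wire.endTransmission(false);
  Wire.requestFrom((int)AS7331_ADDR, 2);
  uint16_t lo = Wire.read();  // results assumed LSB-first; verify on hardware
  uint16_t hi = Wire.read();
  return (hi << 8) | lo;
}

void setup() {
  Serial.begin(115200);
  Wire.begin();
  // Switch from configuration to measurement state and start converting
  // (0x83 = start bit plus measurement-state bits, per one datasheet reading).
  writeReg(REG_OSR, 0x83);
}

void loop() {
  delay(500);  // let a conversion finish; conversion time is configurable
  Serial.print("UV-A: "); Serial.println(readResult(REG_MRES1));
  Serial.print("UV-B: "); Serial.println(readResult(REG_MRES2));
  Serial.print("UV-C: "); Serial.println(readResult(REG_MRES3));
}
```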

Head over to his site to check out all the details of this Flipper Zero project, and to learn more about the different types of UV radiation. Also be sure to let us know about any of your Flipper Zero projects.

Lancing College Shares Critical Design Review for UK CanSat Entry

5 May 2025 at 05:00
UK CanSat Competition, Space Ex, Lancing College, Critical Design Review

A group of students from Lancing College in the UK have sent in their Critical Design Review (CDR) for their entry in the UK CanSat project.

Per the competition guidelines, the UK CanSat project challenges students aged 14 to 19 to build a can-sized "satellite" that can relay telemetry data about atmospheric conditions of the sort that could help with space exploration. The students' primary mission is to collect temperature and pressure readings, and for their secondary mission they chose to collect GPS data, for use on planets where GPS infrastructure is available, such as Earth. This CDR follows their Preliminary Design Review (PDR).

The six students in the group bring a range of relevant skills. Their satellite transmits six metrics every second: temperature, pressure, altitude reading 1, altitude reading 2, latitude, and longitude. The main processor is an Arduino Nano Every; a BMP388 sensor provides the first three metrics, and a BE880 GPS module provides the remaining three. An RFM69HCW module handles radio transmission and reception (note that the RFM69HCW is an FSK transceiver rather than a LoRa part).
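
As a rough idea of how those six metrics per second might come together, here is a sketch using the widely available Adafruit BMP3XX, TinyGPSPlus, and RadioHead RH_RF69 Arduino libraries; the pins, frequency, and packet layout are guesses, not the team's design.

```cpp
#include <Adafruit_BMP3XX.h>
#include <TinyGPS++.h>
#include <RH_RF69.h>

constexpr float SEA_LEVEL_HPA = 1013.25f;
Adafruit_BMP3XX bmp;
TinyGPSPlus gps;
RH_RF69 rf69(10, 2);  // chip-select and interrupt pins: placeholders

struct Telemetry {     // six metrics, packed as-is onto the air
  float temperatureC;
  float pressurePa;
  float baroAltitudeM;  // altitude reading 1 (from the BMP388)
  float gpsAltitudeM;   // altitude reading 2 (from the BE880)
  float latitude;
  float longitude;
};

void setup() {
  Serial1.begin(9600);       // GPS module on the Nano Every's hardware UART
  bmp.begin_I2C();
  rf69.init();
  rf69.setFrequency(434.0);  // typical CanSat band; check local rules
}

void loop() {
  while (Serial1.available()) gps.encode(Serial1.read());  // feed NMEA parser

  if (bmp.performReading()) {
    Telemetry t = {
      (float)bmp.temperature,
      (float)bmp.pressure,
      bmp.readAltitude(SEA_LEVEL_HPA),
      (float)gps.altitude.meters(),
      (float)gps.location.lat(),
      (float)gps.location.lng(),
    };
    rf69.send((uint8_t *)&t, sizeof(t));  // 24-byte binary packet
    rf69.waitPacketSent();
  }
  delay(1000);  // once per second, per the CDR
}
```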

The students present their plan and progress in a Gantt chart, catalog their inventory of relevant skills, assess risks, prepare mechanical and electrical designs, breadboard the satellite circuitry and receiver wiring, design a PCB in KiCad, and develop flow charts for the software. The use of Blender for data visualization was a nice hack, as was using ChatGPT to generate an example data file for testing purposes. Mechanical details such as parachute design and composition are worked out along with a shiny finish for high visibility. The students conduct various tests to ensure the suitability of their design and then conduct an outreach program to advertise their achievements to their school community and the internet at large.

We here at Hackaday would like to wish these talented students every success with their submission, and we hope launch day, March 4th, went well!

The backbone of this project is its radio telemetry link, and if you're interested in that sort of thing we've covered plenty of similar LoRa-based builds here at Hackaday, such as this rain gauge and these soil moisture sensors.

Comparing ‘AI’ for Basic Plant Care With Human Brown Thumbs

By: Maya Posch
30 April 2025 at 02:00

The future of healthy indoor plants, courtesy of AI. (Credit: [Liam])
Like so many of us, [Liam] has a big problem. Whether it’s the curse of Brown Thumbs or something else, those darn houseplants just keep dying despite guides always telling you how incredibly easy it is to keep them from wilting with a modicum of care each day, even without opting for succulents or cactuses. In a fit of despair [Liam] decided to pin his hopes on what we have come to accept as the Savior of Humankind, namely ‘AI’, which can stand for a lot of things, but it’s definitely really smart and can even generate pretty pictures, which is something that the average human can not. Hence it’s time to let an LLM do all the smart plant caring stuff with ‘PlantMom’.

Since LLMs (so far) don't come with physical appendages by default, some hardware had to be plugged together to measure parameters like light, temperature, and soil moisture. Add to this a grow light and a water pump, and all that remained was to tell the LLM, using an extensive prompt (containing Python code), what it should do (keep the plant alive) and what responses (Python methods) were available. Then it was just a matter of letting the 'AI' (Google's Gemma 3) handle it.

To say that this resulted in a dramatic failure along with what reads like an emotional breakdown (on the side of the LLM) would be an understatement. The LLM insisted on turning the grow light on when it should be off and had the most erratic watering responses imaginable based on absolutely incorrect interpretations of the ADC data (flipping dry vs wet). After this episode the poor chili plant’s soil was absolutely saturated and is still trying to dry out, while the ongoing LLM experiment (with empty water tank) has the grow light blasting more often than a weed farm.

So far it seems that the humble state machine's job is still safe from being taken over by 'AI', and not even brown-thumbed folk can kill plants this efficiently.
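
For contrast, here is roughly what the "humble state machine" version of PlantMom looks like: hysteresis thresholds on the soil reading so the pump can't chatter or flood the pot, and a fixed schedule for the grow light. The pins, thresholds, and dry-versus-wet polarity are placeholders for whatever the real hardware reads.

```cpp
#include <Arduino.h>

constexpr int SOIL_ADC_PIN  = A0;
constexpr int PUMP_PIN      = 7;
constexpr int LIGHT_PIN     = 8;
constexpr int DRY_THRESHOLD = 650;  // higher ADC value = drier soil (assumed!)
constexpr int WET_THRESHOLD = 450;  // stop watering well before saturation

enum class State { Idle, Watering };
State state = State::Idle;

void setup() {
  pinMode(PUMP_PIN, OUTPUT);
  pinMode(LIGHT_PIN, OUTPUT);
}

void loop() {
  int soil = analogRead(SOIL_ADC_PIN);

  // Two thresholds give hysteresis: the pump only starts on "clearly dry"
  // and only stops on "clearly wet", never oscillating in between.
  switch (state) {
    case State::Idle:
      if (soil > DRY_THRESHOLD) {
        digitalWrite(PUMP_PIN, HIGH);
        state = State::Watering;
      }
      break;
    case State::Watering:
      if (soil < WET_THRESHOLD) {
        digitalWrite(PUMP_PIN, LOW);
        state = State::Idle;
      }
      break;
  }

  // Grow light on a fixed 16-hours-on / 8-hours-off cycle.
  unsigned long hourOfDay = (millis() / 3600000UL) % 24;
  digitalWrite(LIGHT_PIN, hourOfDay < 16 ? HIGH : LOW);

  delay(1000);
}
```

No emotional breakdowns, no inverted ADC readings, and the chili plant gets to keep its soil merely damp.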
