
Math AI

By: EasyWithAI
29 July 2024 at 12:17
Math AI is an AI Chrome extension that uses GPT-4 Vision technology to assist with solving complex math problems. This tool allows you to simply screenshot math problems and receive instant, step-by-step solutions. Math AI currently supports 21 languages and has multiple educational modes such as detailed steps for complex problems. The extension offers features […]

Source

Asksia

By: EasyWithAI
20 July 2025 at 12:05
Asksia is an AI tutor and study buddy designed to streamline the entire learning process by combining lecture transcription, document analysis, and intelligent note organization into a single platform. Asksia offers a range of AI problem-solving tools, including a statics solver, AP World score calculator, physics problem solver, accounting AI solver, chemistry […]

Source

Reminisce.ai

By: EasyWithAI
25 August 2023 at 12:17
Reminisce.ai is an AI-powered online learning platform that makes it easy and fun to build technology skills and career paths. It uses cheat sheets, quizzes, and games to help you learn IT skills like Kubernetes, React, and AWS. With personalized career coaching, you can develop the right skills for roles like AI Engineer, Blockchain Developer, […]

Source

Solvely

By: EasyWithAI
12 September 2024 at 12:24
Solvely is an AI homework-help app that solves math, science, and liberal arts problems with photos. It offers detailed explanations for topics like algebra and calculus and includes a 24/7 AI tutor, making learning straightforward and fun. Solvely also offers an AI quiz generator which can help you turn full texts into interactive and educational […]

Source

Training a Transformer with 1970s-era Technology

30 March 2026 at 02:00

Although generative language models have found little widespread, profitable adoption outside of putting artists out of work and giving tech companies an easy scapegoat for cutting staff, their underlying technology remains a fascinating area of study. We could step back to the more innocent time of the late 2010s, before the cultural backlash, and examine these models in their early stages. Or we could see how even older technology handles these machine learning algorithms and learn something about their fundamentals along the way. [Damien Boureille] has put a 60s-era IBM as well as a PDP-11 to work training a transformer algorithm in order to take a closer look at it.

Given the age of the hardware, the task [Damien Boureille] trains his transformer on is modest: reversing a list of digits. This is a trivial problem for something like a Python program but much more difficult for a transformer. The model relies solely on self-attention and a residual connection. To fit within the 32KB memory limit of the PDP-11, it employs fixed-point arithmetic and lookup tables to replace computationally expensive functions. Training is optimized with hand-tuned learning rates and stochastic gradient descent, achieving 100% accuracy in 350 steps. In practical terms, that brought the training time down from hours or days to around five minutes.
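To make the fixed-point-plus-lookup-table trick concrete, here is a minimal sketch (in Python for readability, not [Damien Boureille]'s actual code) of how a memory-constrained machine can compute a softmax without any floating-point or transcendental instructions: values live in Q8.8 fixed point, and `exp` is replaced by a precomputed table so the hot path is only lookups, adds, shifts, and one integer division.

```python
import math

SHIFT = 8           # Q8.8 fixed point: 8 integer bits, 8 fractional bits
ONE = 1 << SHIFT    # 1.0 in fixed-point representation

def to_fix(x: float) -> int:
    """Convert a float to Q8.8 fixed point."""
    return int(round(x * ONE))

def from_fix(q: int) -> float:
    """Convert Q8.8 fixed point back to a float."""
    return q / ONE

# Precompute exp(x) for x in [-8.0, 0.0] once (2049 entries, ~4KB as
# 16-bit words). At "runtime" the expensive function becomes a lookup.
EXP_TABLE = [to_fix(math.exp(from_fix(q))) for q in range(-8 * ONE, 1)]

def fexp(q: int) -> int:
    """exp() via table lookup; input clamped to the table's domain."""
    q = max(-8 * ONE, min(0, q))
    return EXP_TABLE[q + 8 * ONE]

def fixed_softmax(logits: list[int]) -> list[int]:
    """Softmax over fixed-point logits, returning fixed-point probabilities."""
    m = max(logits)
    # Subtracting the max keeps every exp() input in [-8, 0],
    # which is exactly the range the table covers.
    exps = [fexp(q - m) for q in logits]
    total = sum(exps)
    # Fixed-point division: scale the numerator up before integer division.
    return [(e << SHIFT) // total for e in exps]
```

With 8 fractional bits the result only carries about two decimal digits of precision, which is part of why training such a model takes careful hand-tuning: quantization error compounds through every attention layer.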

Not only does a project like this help understand these tools, but it also goes a long way towards demonstrating that not every task needs a gigawatt datacenter to be useful. In fact, we’ve seen plenty of large language models and other generative AI running on computers no more powerful than an ESP32 or, if you need slightly more computing power, on consumer-grade PCs with or without GPUs.
