
How Sony Mastered the Transistor

5 September 2024 at 05:00

When you think of Sony, you probably think of a technology company that’s been around forever. However, as [Asianometry] points out, it really formed in the tough years after World War II. The two people behind the company’s formation were an interesting pair: one was a visionary engineer, and the other was a consummate businessman.

While it is hard to imagine today, securing a license to produce transistors was difficult in the early days. Worse, even with the license, the crude devices of the era weren’t really usable in a radio.

The devices were poor by today’s standards, and while transistors would work at audio frequencies for hearing aids, getting them to work at AM radio frequencies was a challenge. The Sony founders had to decide whether to use alloy transistors or grown crystal transistors.

Western Electric did not want to share its crystal-growing technology, so in 1954, the team created an alloy transistor. However, it failed to work well at radio frequencies, so they shifted to growing crystals, which seemed more amenable to scaling. One of the team members tried phosphorus and indium doping and created a transistor that could work at higher frequencies. But there was a problem.

Despite the transistor’s superior performance, they couldn’t make another one. Common wisdom at the time was that phosphorus doping was a dead end, but it had worked once; it just took time to find the right way to do it. By 1955, they produced usable transistors, even though the yield was only around 5%.

Texas Instruments beat them to market with a transistor radio, the Regency TR-1, in 1954, but in 1955, Sony produced the TR-55. Of the five transistors inside, some were alloy transistors and some were grown crystals. The factory had to hand-select crystal transistors to make each unit work. The radios sold for about 19,000 yen (the TR-1 cost about 50 bucks, which in 1954 was nearly $600 in today’s money). Adjusting for inflation, a Japanese teenager would shell out about $850 in today’s money for the TR-55.

The TR-55 wasn’t the first Sony radio to have transistors. The TR-52 was a prototype, but it had case problems and never made it into the hands of the public. Sony’s radios didn’t make it to the United States until 1957. By then, Texas Instruments, Raytheon, and GE all had radios available, too.

It is a fascinating look into the history of an iconic electronics brand and a window into another world that, honestly, wasn’t that long ago. We couldn’t help but note similarities with Apple, who also had a businessman and engineer combination. Sony would go on to innovate in a number of areas, including optical data storage.

A Windows Control Panel Retrospective Amidst a Concerning UX Shift

By: Maya Posch
3 September 2024 at 14:00

Once the nerve center of Windows operating systems, the Control Panel and its multitude of applets have their roots in the earliest versions of Windows. From here, users could use these configuration applets to control and adjust just about anything in a friendly graphical environment. Despite the lack of any significant criticism from users, and with generations having grown up with its familiar dialogs, over the past years it has been gradually phased out by the monolithic Universal Windows Platform (UWP) based Settings app.

Whereas the Windows control panel features an overview of the various applets – each of which uses Win32 GUI elements like tabs to organize settings – the Settings app is more Web-like, with lots of touch-friendly whitespace, a single navigable menu, kilometers of settings to scroll through and absolutely no way to keep more than one view open at the same time.

Unsurprisingly, this change has not been met with a lot of enthusiasm by the average Windows user, and with Microsoft now officially recommending users migrate over to the Settings app, it seems that before long we may have to say farewell to what has been an intrinsic part of the Windows operating system since its first iterations. Yet bizarrely, much of the Control Panel functionality doesn’t exist yet in the Settings app, and it remains an open question how much of it can be translated into the Settings app user experience (UX) paradigm at all.

Considering how unusual this kind of settings interface used to be outside of touch-centric platforms like Android and iOS, what is Microsoft’s goal here? Have they discovered a UX secret that has eluded every other OS developer?

A Simple Concept

The Windows 3.1 Control Panel (1992). (Source: ToastyTech.com)

Settings which a user may want to tweak on their computer system range from hardware devices and networks to the display resolution and wallpaper, so it makes sense to put all of these configuration options in one easy-to-reach location. Generally this has meant something akin to a folder containing various clickable icons with accompanying text, which together make clear what settings can be configured by opening them. In addition, the same settings dialogs can be accessed through context-sensitive menus, such as when right-clicking on the desktop.

The Windows 98 Control Panel. (Source: ToastyTech.com)

It’s little wonder that for the longest time operating systems have settled for this approach, as it is intuitive, and individual items can have stylized icons that make it even more obvious what settings can be configured by clicking on them, such as a keyboard, a mouse, a display, etc. As graphical fidelity increased, so did the styling of these icons, with MacOS, Windows, BeOS and the various desktop environments for OSs like the Linuxes and BSDs all developing their own highly skeuomorphic styles to make their UIs more intuitive and also more pleasant to look at. A good overview of the Windows Control Panel evolution can be found over at the Version Museum website.

The Windows XP Control Panel in ‘Classic’ view. (2001) (Source: suffieldacademy.org)

Coming from the still somewhat subdued style of Windows XP after years of Windows 9x and Windows NT/2000, Windows Vista and Windows 7 cranked this style up to eleven with the Windows Aero design language. This meant glass, color, translucency, depth and high-fidelity icons that made the function of the Control Panel’s individual entries more obvious than ever, creating a masterpiece that would be very hard to beat. The user was also given two different ways to view the Control Panel: the simplified category-based view, or the ‘classic’ view with all icons (and folders for e.g. Administrative Tools) visible in one view.

Windows 7 Control Panel (2009) in category view. (Source: techrepublic.com)

Meanwhile Apple did much the same thing, leaning heavily into their unique design language not only for its desktop, but ultimately also for its mobile offerings. Everything was pseudo-3D, with vivid colors adorning detailed renderings of various physical items and so on, creating a true feast for the eyes when taking in these lush UIs, with efficient access to settings via clearly marked tabs and similar UI elements.

The Mac OS X Panther System Preferences in 2003. (Source: Gadget Unity TV)

This way of organizing system settings was effectively replicated across a multitude of environments, with operating systems like Haiku (based on BeOS) and ReactOS (re-implementing Windows) retaining those classical elements of the original. A truly cross-platform, mostly intuitive experience was created, and Bliss truly came to the computing world.

Naturally, something so good had no right to keep existing, ergo it had to go.

The World Is Flat

The first to make the big change was Microsoft, with the release of Windows 8 and its Metro design language. This new visual style relied on simple shapes, with little to no adornments or distractions (i.e. anything more than a single color). Initially Microsoft also reckoned that Windows users wanted every window to be full-screen, and that hot edges and sides rather than a taskbar and Start menu were the way to go, as every single system running Windows 8 would obviously have a touch screen. Fortunately they did backtrack on this, but their attempt to redesign the Control Panel into something more Metro-like with the Settings app did persist, like an odd growth somewhere on a body part.

Windows 8’s PC Settings app (2012). (Source: softpedia.com)

Although the Control Panel remained in Windows 8 as well, the course had been set. Over time this small lump developed into the Settings app in Windows 10, by which time Metro had been renamed into the Microsoft Design Language (MDL), which got a recent tweak in what is now called the Fluent Design Language (FDL) for Windows 11.

Central to this is the removal of almost all colors, the use of text labels over icons where possible (though simple monochrome icons are okay) and only rectangles with no decorations. This also meant no folder-centric model for settings but rather all the items put into a text-based menu on the left-hand side and an endless scroll-of-doom on the right side containing sparsely distributed settings.

This led to the absolutely beautifully dystopian Settings app as it exists in Windows 10:

The Settings app in Windows 10 back in ~2015. Hope you don’t like colors.

All of this came as skeuomorphic designs were suddenly considered ‘passé’, and the new hotness was so-called Flat Design. Google’s Material Design, developed in 2014, is another good example of this, with the characteristic ‘flat UI elements adrift in a void’ aesthetic that has now been adopted by Microsoft, and by Apple as well, starting in 2022 with MacOS Ventura’s System Settings (replacing System Preferences).

Monterey’s General system preferences (left) are different from Ventura’s General system settings (right). (Credit: MacWorld)

Rather than a tabbed interface to provide a clear overview, everything is now a blind hierarchy of menu items to scroll through and activate to access sub-, sub-sub-, and sub-sub-sub- items, and inevitably realize a few times that you’re in the wrong section. But rather than being able to click that other, correct tab, you now get to navigate back multiple views, one click at a time.

It isn’t just Windows and Apple either; many of the big desktop environments, like Gnome, have also moved to this Flat Design Language. While various reasons have been provided for these changes, it’s undeniable that FDL makes a UI less intuitive (because there’s less useful visual information) and makes for a worse user experience (UX) with worse ergonomics as a result (because of the extra scrolling and clicking). This is especially obvious in the ‘independent applets’ versus ‘monolithic settings app’ comparison.

One-Track Mind

Imagine that you’re trying out a couple of new wallpapers in Windows while keeping an eye on Windows Update’s latest shenanigans. You then need to quickly change the default audio device, or make another small adjustment unrelated to any of these other tasks. If you are using Windows 7 or earlier with the Control Panel applets, this is normal behavior and exceedingly common, especially during hardware troubleshooting sessions.

If you’re using the Settings app, this is impossible, as only one view can be active at a given time. You think you’re smart and right-click the desktop for ‘Personalize desktop’ so that the other Settings view stays intact? That is not how it works, as the Settings app is monolithic and simply shifts to the newly selected view. Currently this is not too noticeable yet, as many applets still exist in Windows 10 and 11, but as more and more of these are assimilated into the Settings app, such events will become more and more common.

It would seem that after decades of UI and UX evolution, we have reached a point where UX is only getting worse, with the turning point arguably around the release of Windows 8. With color banished, anything even remotely pseudo-3D frowned upon and UIs based around touch interfaces, there will soon be no difference between using a desktop PC, tablet or smartphone. Just in the worst way possible, as nobody has ever written about the amazing ergonomics and efficient UX of the latter two devices.

Perhaps our only hope may lie with the OSes and desktop environments that keep things real and stick to decades of proven UX design rather than give in to Fad Driven Development.

Rest in peace, Windows Control Panel. We hope to see you again soon in ReactOS.

DEC’s LAN Bridge 100: The Invention of the Network Bridge

By: Maya Posch
28 August 2024 at 02:00

DEC’s LAN Bridge 100 was a major milestone in the history of Ethernet, one that made it a viable option for the ever-growing LANs of yesteryear and today. Its history is also the topic of a recent video by [The Serial Port], in which [Mark] covers the development history of this device. We previously covered the LANBridge 100 Ethernet bridge and what it meant as Ethernet saw itself forced to scale from a shared medium (ether) to a star topology featuring network bridges and switches.

Featured in the video is also an interview with [John Reed], a field service network technician who worked at DEC from 1980 to 1998. He demonstrates what the world was like with early Ethernet, with thicknet coax (10BASE5) requiring a rather enjoyable way to crimp on connectors. Even with the relatively sluggish 10 Mbit/s of thicknet Ethernet, adding an Ethernet store-and-forward bridge between two of these networks required significant amounts of processing power due to the sheer number of packets, but the beefy Motorola 68k CPU was up to the task.

To prevent issues with loops in the network, the spanning tree algorithm was developed and implemented, forming the foundation of modern-day Ethernet LANs, as demonstrated by the basic LAN Bridge 100 unit that [Mark] fires up and which works fine in a modern-day LAN after its start-up procedure. Even if today’s Ethernet bridges and switches have become smarter and more powerful, it all started with that first LAN Bridge.
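
To get a feel for what the spanning tree algorithm accomplishes, here is a rough Python sketch of the core idea: elect the bridge with the lowest ID as the root, keep only the cheapest path from every other bridge back to it, and block the redundant links that would otherwise form loops. This is a toy model of our own, not DEC’s implementation or the full IEEE 802.1D protocol, which works by exchanging BPDUs and juggling timers.

# A toy model of spanning tree root election and loop removal (not real 802.1D BPDU exchange).
import heapq

def spanning_tree(bridges, links):
    """bridges: list of bridge IDs (lowest ID wins the root election).
    links: dict {(a, b): cost} describing bidirectional segments.
    Returns (root bridge, set of links left forwarding)."""
    root = min(bridges)                      # lowest bridge ID becomes the root bridge
    adj = {b: [] for b in bridges}
    for (a, b), cost in links.items():
        adj[a].append((b, cost, (a, b)))
        adj[b].append((a, cost, (a, b)))
    # Dijkstra from the root: each bridge keeps the cheapest path back to the root
    # (its "root port"); every other, redundant link ends up blocked.
    best = {root: 0}
    via = {}                                 # bridge -> link it uses to reach the root
    queue = [(0, root)]
    while queue:
        cost, node = heapq.heappop(queue)
        if cost > best.get(node, float("inf")):
            continue
        for neigh, link_cost, link in adj[node]:
            new_cost = cost + link_cost
            if new_cost < best.get(neigh, float("inf")):
                best[neigh] = new_cost
                via[neigh] = link
                heapq.heappush(queue, (new_cost, neigh))
    forwarding = set(via.values())           # links on the tree stay forwarding
    return root, forwarding

# Example: a triangle of three bridges; one link gets blocked to break the loop.
bridges = [1, 2, 3]
links = {(1, 2): 10, (2, 3): 10, (1, 3): 10}
root, forwarding = spanning_tree(bridges, links)
print("root bridge:", root)
print("blocked links:", set(links) - forwarding)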

The Famous Computer Cafe Has Now Been Archived Online

By: Lewin Day
23 August 2024 at 11:00

You might think that TV stations or production houses would be great at archiving, but it’s not always the case. Particularly from the public access perspective. However, if you’re a fan of The Famous Computer Cafe, you’re in luck! The beloved series has now been preserved on The Internet Archive!

If you’re not familiar with the show, it was a radio program broadcast from 1983 to 1986, pumped out by a variety of radio stations in southern and central California during that period. The creators made sure to keep a copy of each episode on reel-to-reel tape. For years, these tapes were tragically lost, until archivist [Kay Savetz] was able to recover some of them from a recent property sale. From there, a GoFundMe paid for digitization, and the show has been placed on The Internet Archive with the blessings of the original creators.

This is quite the cultural victory, particularly when you observe the list of guests on the show. Timothy Leary, Bill Gates, Jack Tramiel, and even Douglas Adams made appearances in the recovered recordings. Sadly, though, not all the tapes have been recovered. Episodes with Gene Roddenberry, Robert Moog, and Ray Bradbury are still lost to time.

If you fancy a listen, 53 episodes presently exist on the archive. Take a trip back in time and hear from some technological visionaries—and futurists—speaking their minds at the very beginning of the microcomputer era! If you find any particularly salient gems, don’t hesitate to drop them on the tip line.

Tech in Plain Sight: Speedometers

22 August 2024 at 14:00

In a modern car, your speedometer might look analog, but it is almost certainly digital and driven by the computer that has to monitor all sorts of things anyway. But how did they work before your car was a rolling computer complex? The electronic speedometer has been around for well over a century and, when you think about it, qualifies as a technological marvel.

If you already know how they work, this isn’t a fair question. But if you don’t, think about this. Your dashboard has a cable running into it. The inner part of the cable spins at some rate, which is related to either the car’s transmission or a wheel sensor. How do you make a needle deflect based on the speed?

Mechanical Solutions

Early versions of the speedometer used a governor pulling against a spring. The faster it rotates, the more the two weights of the governor pull out against the spring, and the needle moves with the weights.

As an aside, this sort of centrifugal governor is also known as a fly-ball governor, and similar devices were commonly used to regulate the maximum throttle on steam engines. The arms of the governor would be fully extended once the engine reached its top speed, which led to the term “balls-out” being used to describe a machine operating at its upper limits.

Another type of mechanical speedometer had an escapement like a watch. The time mechanism would move the needle back, and the rotation of the wheels would move it forward. The net result was a needle position that would increase with speed.

The Magnetic Approach

However, most cars use a magnetic type speedometer — although it doesn’t work in the way you might imagine. There’s no reed relay or Hall effect sensing the magnetic field. Instead, there is an aluminum cup attached to the speedometer needle and, nearby, a magnet that spins on a shaft moving at some ratio of the car’s speed. There’s no direct connection between the two.

Being a non-ferrous metal, aluminum is not generally something we think of as being affected by magnets. Under normal circumstances that might be true, but a moving magnetic field will induce eddy currents in aluminum. This forms a field in the aluminum, too, and the spinning magnet tends to drag the cup, thereby deflecting the pointer.

A spring similar to one you might find in a mechanical clock or watch pulls back the pointer so the needle hovers at the point where the force of the magnet pulls against the spring. The pull on the spring has to account for the gear ratios and the size of the tires to accurately reflect the vehicle’s speed.
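
As a back-of-the-envelope model (our own simplification, with made-up constants, not numbers from any real instrument): the eddy-current drag torque is roughly proportional to how fast the magnet spins, while the hairspring pulls back in proportion to the needle’s deflection, so the needle settles at an angle proportional to speed.

# Toy equilibrium model of an eddy-current speedometer (illustrative constants only).
K_DRAG = 1.2e-5    # drag torque per rpm of the spinning magnet, in N*m/rpm (made-up value)
K_SPRING = 1.0e-4  # hairspring restoring torque per degree of deflection, in N*m/deg (made-up)

def needle_angle_deg(magnet_rpm):
    """Deflection where drag torque equals spring torque: K_DRAG * rpm = K_SPRING * angle."""
    return K_DRAG * magnet_rpm / K_SPRING

for rpm in (0, 500, 1000, 2000):
    print(f"{rpm:5d} rpm of the magnet -> needle at {needle_angle_deg(rpm):5.1f} degrees")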

If you want to see an entertaining teardown of an old speedometer, [Tubalcain/Mr Pete] has you covered in the video below. He also shows how the odometer worked.

Modern Times

Of course, these days you are more likely to pick up a pulse using a Hall effect sensor or some other sensor on the vehicle and just count the pulses in the car’s computer. In fact, the pulses might be encoded at the source and travel over something like a CAN bus to get to the computer.
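
Turning that pulse count into a speed is then just arithmetic, as in this short Python sketch (the pulses-per-revolution and tire circumference below are placeholder values, not from any particular vehicle):

# Convert wheel-speed sensor pulses into road speed (placeholder constants).
PULSES_PER_REV = 40          # pulses the sensor emits per wheel revolution (assumed)
TIRE_CIRCUMFERENCE_M = 1.95  # rolling circumference of the tire in meters (assumed)

def pulse_speed_kmh(pulse_count, interval_s):
    """Speed from pulses counted over a sampling interval."""
    revs_per_second = pulse_count / PULSES_PER_REV / interval_s
    meters_per_second = revs_per_second * TIRE_CIRCUMFERENCE_M
    return meters_per_second * 3.6

print(f"{pulse_speed_kmh(pulse_count=550, interval_s=1.0):.1f} km/h")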

It is also possible to pick up speed from other tracking information like GPS, although that might not be as accurate. But if you have, for example, a mobile phone app that shows your speed, that’s probably what it is doing. The obvious way to do that is to take position measurements periodically and then do the math. However, more sophisticated systems can actually measure Doppler shift to get a more accurate reading.
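
Here is a minimal Python sketch of the “take position measurements periodically and do the math” approach, using the haversine formula for the distance between two GPS fixes. The coordinates are made up for the example, and a real app would also filter out GPS jitter.

# Estimate speed from two timestamped GPS fixes (position-differencing, not Doppler).
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def gps_speed_kmh(fix_a, fix_b):
    """Each fix is (latitude, longitude, unix_time_seconds)."""
    distance = haversine_m(fix_a[0], fix_a[1], fix_b[0], fix_b[1])
    elapsed = fix_b[2] - fix_a[2]
    return distance / elapsed * 3.6

# Two fixes one second apart, roughly 25 m apart (made-up coordinates).
print(f"{gps_speed_kmh((52.00000, 4.00000, 0.0), (52.00022, 4.00000, 1.0)):.1f} km/h")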

We see a lot of bicycle speedometers for some reason. Eddy currents make induction cooktops work, too. Even tiny ones.

The First Mass Produced DRAM of the Soviet Union

By: Maya Posch
22 August 2024 at 02:00
KE565RU1A (1985) in comparison with the analogue from AMD (1980)

Although the benefits of semiconductor technology were undeniable during the second half of the 20th century, there was a clear divide between the two sides of the Iron Curtain. Whilst the First World had access to top-of-the-line semiconductor foundries and engineers, the Second World had to get by with scraps. Unable to keep up with the frantic pace of the USA’s developments in particular, the USSR saw itself reduced to copying Western designs and smuggling in machinery where possible. A good example of this is the USSR’s first mass-produced dynamic RAM (DRAM), the 565RU1, as detailed by [The CPUShack Museum].

While the West’s first commercially mass-produced DRAM arrived in 1970 with the Intel 1103 (1024 x 1) and its three-transistor design, the 565RU1 was developed in 1975, with engineering samples produced until the autumn of 1977. This DRAM chip featured a three-transistor design, with a 4096 x 1 layout and characteristics reminiscent of Western DRAM ICs like the TI TMS4060. It was produced at a range of microelectronics enterprises in the USSR, including Angstrem, Mezon (Moldova), Alpha (Latvia) and Exciton (Moscow).

Of course, by the second half of the 1970s the West had already moved on to more efficient, single-transistor DRAM designs. Although the 565RU1 was never known for being that great, it was nevertheless used throughout the USSR and the Second World. One example of this is a 1985 article (page 2) by [V. Ye. Beloshevskiy], the Electronics Department Chief of the Belorussian Railroad Computer Center, in which the unreliability of the 565RU1 ICs is described, along with ways to add redundancy to the (YeS1035) computing systems.

Top image: 565RU1 die manufactured in 1981.

Australia Didn’t Invent WiFi, Despite What You’ve Heard

By: Lewin Day
20 August 2024 at 14:00

Wireless networking is all-pervasive in our modern lives. Wi-Fi technology lives in our smartphones, our laptops, and even our watches. Internet is available to be plucked out of the air in virtually every home across the country. Wi-Fi has been one of the grand computing revolutions of the past few decades.

It might surprise you to know that Australia proudly claims the invention of Wi-Fi as its own. It had good reason to, as well, given the money that would surely be due to the creators of the technology. However, dig deeper, and you’ll find things are altogether more complex.

Big Ideas

The official Wi-Fi logo.

It all began at the Commonwealth Scientific and Industrial Research Organization, or CSIRO. The government agency has a wide-ranging brief to pursue research goals across many areas. In the 1990s, this extended to research into various radio technologies, including wireless networking.

The CSIRO is very proud of what it achieved, crediting itself with “Bringing WiFi to the world.” It’s a common piece of trivia thrown around the pub as a bit of national pride—it was scientists Down Under that managed to cook up one of the biggest technologies of recent times!

This might sound a little confusing to you if you’ve looked into the history of Wi-Fi at all. Wasn’t it the IEEE that established the working group for 802.11? And wasn’t it that standard that was released to the public in 1997? Indeed, it was!

The fact is that many groups were working on wireless networking technology in the 1980s and 1990s. Notably, the CSIRO was among them, but it wasn’t the first by any means—nor was it involved with the group behind 802.11. That group formed in 1990, while the precursor to 802.11 was actually developed by NCR Corporation/AT&T in a lab in the Netherlands in 1991. The first standard of what would later become Wi-Fi—802.11-1997—was established by the IEEE based on a proposal by Lucent and NTT, with a bitrate of just 2 Mbit/s and operating at 2.4 GHz. This standard operated based on frequency-hopping or direct-sequence spread spectrum technology. This later developed into the popular 802.11b standard in 1999, which upped the speed to 11 Mbit/s. 802.11a came later, switching to 5 GHz and using a modulation scheme based around orthogonal frequency division multiplexing (OFDM).

A diagram from the CSIRO patent for wireless LAN technology, dated 1993.

Given we apparently know who invented Wi-Fi, why are Australians taking credit? Well, it all comes down to patents. A team at the CSIRO had long been developing wireless networking technologies on its own. In fact, the group filed a patent on 19 November 1993 entitled “Invention: A Wireless Lan.” The crux of the patent was the idea of using multicarrier modulation to get around a frustrating problem—that of multipath interference in indoor environments. This was followed up with a later US patent in 1996 along the same lines.
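
For those unfamiliar with the term, here is a bare-bones Python/NumPy illustration of multicarrier modulation, the general idea that OFDM builds on: spread the data across many parallel subcarriers with an inverse FFT and add a cyclic prefix to soak up multipath echoes. It is purely didactic and does not reflect the CSIRO patent or any actual 802.11 PHY.

# Minimal multicarrier (OFDM-style) modulator/demodulator sketch using an IFFT/FFT pair.
import numpy as np

N_SUBCARRIERS = 64   # number of parallel subcarriers (arbitrary for the demo)
CYCLIC_PREFIX = 16   # samples copied from the tail to the head to absorb multipath echoes

def modulate(bits):
    """Map bits to BPSK symbols, put one symbol on each subcarrier, and IFFT to the time domain."""
    symbols = 2 * np.asarray(bits, dtype=float) - 1          # 0/1 -> -1/+1 (BPSK)
    time_domain = np.fft.ifft(symbols, n=N_SUBCARRIERS)
    return np.concatenate([time_domain[-CYCLIC_PREFIX:], time_domain])

def demodulate(samples):
    """Drop the cyclic prefix, FFT back to subcarriers, and slice BPSK symbols back to bits."""
    spectrum = np.fft.fft(samples[CYCLIC_PREFIX:], n=N_SUBCARRIERS)
    return (spectrum.real > 0).astype(int)

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, N_SUBCARRIERS)
recovered = demodulate(modulate(bits))
print("all bits recovered:", np.array_equal(bits, recovered))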

The patents were filed because the CSIRO team reckoned they’d cracked wireless networking at rates of many megabits per second. But the details differ quite significantly from the modern networking technologies we use today. Read the patents, and you’ll see repeated references to “operating at frequencies in excess of 10 GHz.” Indeed, the diagrams in the patent documents refer to transmissions in the 60 to 61 GHz range. That’s rather different from the mainstream Wi-Fi standards established by the IEEE. The CSIRO tried over the years to find commercial partners to help establish its technology; however, little came of it, barring a short-lived start-up called Radiata that was swallowed up by Cisco, never to be seen again.

Steve Jobs shocked the crowd with a demonstration of the first mainstream laptop with wireless networking in 1999. Funnily enough, the CSIRO name didn’t come up.

Based on the fact that the CSIRO wasn’t in the 802.11 working group, and that its patents don’t correspond to the frequencies or specific technologies used in Wi-Fi, you might assume that the CSIRO wouldn’t have any right to claim the invention of Wi-Fi. And yet, the agency’s website could very much give you that impression! So what’s going on?

The CSIRO had been working on wireless LAN technology at the same time as everyone else. It had, by and large, failed to directly commercialize anything it had developed. However, the agency still had its patents. Thus, in the 2000s, it contended that it effectively held the rights to the techniques developed for effective wireless networking, and that those techniques were used in Wi-Fi standards. After writing to multiple companies demanding payment, it came up short. The CSIRO started taking wireless networking companies to court, charging that various companies had violated its patents and demanding heavy royalties, up to $4 per device in some cases. It contended that its scientists had come up with a unique combination of OFDM multiplexing, forward error correction, and interleaving that was key to making wireless networking practical.

An excerpt from the CSIRO’s Australian patent filing in 1993. The agency’s 1996 US patent covers much of the same ground.

A first test case against a Japanese company called Buffalo Technology went the CSIRO’s way. A follow-up case in 2009 took aim at a group of 14 companies. After four days of testimony, the case would have gone down to a jury decision, many members of which would not have been particularly well educated on the finer points of radio communications. The matter was instead settled for $205 million in the CSIRO’s favor. 2012 saw the Australian group go again, taking on a group of nine companies including T-Mobile, AT&T, Lenovo, and Broadcom. This case ended in a further $229 million settlement paid to the CSIRO.

We know little about what went on in these cases, nor the negotiations involved. Transcripts from the short-lived 2009 case had defence lawyers pointing out that the modulation techniques used in the Wi-Fi standards had been around for decades prior to the CSIRO’s later wireless LAN patent.  Meanwhile, the CSIRO stuck to its guns, claiming that it was the combination of techniques that made wireless LAN possible, and that it deserved fair recompense for the use of its patented techniques.

Was this valid? Well, to a degree, that’s how patents work. If you patent an idea, and it’s deemed unique and special, you can generally demand payment from others that would like to use it. For better or worse, the CSIRO was granted a US patent for its combination of techniques to do wireless networking. Other companies may have come to similar conclusions on their own, but they didn’t get a patent for it, and that left them open to very expensive litigation from the CSIRO.

However, there’s a big caveat here. None of this means that the CSIRO invented Wi-Fi. These days, the agency’s website is careful with the wording, noting that it “invented Wireless LAN.”

The CSIRO has published several comics about the history of Wi-Fi, which might confuse some as to the agency’s role in the standard. This paragraph is a more reserved explanation, though it accuses other companies of having “less success”—a bold statement given that 802.11 was commercially successful, and the CSIRO’s 60 GHz ideas weren’t. Credit: CSIRO website via screenshot

It’s certainly valid to say that the CSIRO’s scientists did invent a wireless networking technique. The problem is that in the mass media, this has commonly been translated into the claim that the agency invented Wi-Fi, which it obviously did not. Of course, this misconception doesn’t hurt the agency’s public profile one bit.

Ultimately, the CSIRO did file some patents. It did come up with a wireless networking technique in the 1990s. But did it invent Wi-Fi? Certainly not. And many will contend that the agency’s patents should not have earned it any money from equipment built to standards it had no role in developing. Still, the myth will persist for some time to come. At least until someone writes a New York Times bestseller on the true and exact history of the real Wi-Fi standards. Can’t wait.

The First Air Force One and How it Was Nearly Lost Forever

By: Maya Posch
17 August 2024 at 11:00
For years, the first Air Force One sat neglected and forgotten in an open field at Arizona’s Marana Regional Airport. (Credit: Dynamic Aviation)

Although the designation ‘Air Force One’ is now commonly known to refer to the airplane used by the President of the United States, it wasn’t until Eisenhower that the US President would make significant use of a dedicated airplane. He had a Lockheed VC-121A kitted out to act as his office as commander-in-chief. Called the Columbine II after the Colorado columbine flower, it served a crucial role during the Korean War and would result in the coining of the ‘Air Force One’ designation following a near-disaster in 1954.

This involved a mix-up between Eastern Air Lines 8610 and Air Force 8610 (the VC-121A). After the Columbine II was replaced with a VC-121E model (Columbine III), the Columbine II was mistakenly sold to a private owner, and got pretty close to being scrapped.

In 2016, the plane made a “somewhat scary and extremely precarious” 2,000-plus-mile journey to Bridgewater, Virginia, to undergo a complete restoration. (Credit: Dynamic Aviation)

Although nobody is really sure how this mistake happened, it resulted in the private owner stripping the airplane for parts to keep other Lockheed C-121s and compatible airplanes flying. Shortly before scrapping the airplane, he received a call from the Smithsonian Institution, informing him that this particular airplane was Eisenhower’s first presidential airplane and the first ever Air Force One. This led to him instead fixing up the airplane and trying to sell it off. Ultimately [Karl D. Stoltzfus], the CEO of the airplane maintenance company Dynamic Aviation, bought the partially restored airplane after it had spent another few years baking in the unrelenting sun.

Although the airplane was in a sorry state at this point, [Stoltzfus] put a team led by mechanic [Brian Miklos] to work, and after a year of effort they got it into flying condition by 2016 so that they could fly it over to Dynamic Aviation’s facilities for a complete restoration. At this point the ‘nuts and bolts’ restoration is mostly complete after a lot of improvisation and manufacturing of parts for the 80-year-old airplane, with restoration of the Eisenhower-era interior and exterior now in progress. This should take another few years and another $12 million or so, but would result in a fully restored and flight-worthy Columbine II, exactly as it would have looked in 1953, plus a few modern-day safety upgrades.

Although [Stoltzfus] recently passed away unexpectedly before being able to see the final result, his legacy will live on in the restored airplane, which will after so many years be able to meet up again with the Columbine III, which is on display at the National Museum of the USAF.

A Modern Take on an Old Language

16 August 2024 at 14:00

Some old computer languages are destined to never die. They do, however, evolve. For example, Fortran, among the oldest of computer languages, still has adherents, not to mention a ton of legacy code to maintain. But it doesn’t force you to pretend you are using punched cards anymore. In the 1970s, if you wanted to crunch numbers, Fortran was a good choice. But there was another very peculiar language: APL. Turns out, APL is alive and well and has a thriving community that still uses it.

APL has a lot going for it if you are crunching serious numbers. The main data type is a multidimensional array. In fact, you could argue that a lot of “modern” ideas like a REPL, list types, and even functional programming entered the mainstream through APL. But it did have one strange thing that made it difficult to use and learn.

[Kenneth E. Iverson] was at Harvard in 1957 and started working out a mathematical notation for dealing with arrays. By 1960, he’d moved to IBM and a few years later wrote a book entitled “A Programming Language.” That’s where the name comes from — it is actually an acronym for the book’s title. Being a mathematician, [Iverson] used symbols instead of words. For example, to create an array with the numbers 1 to 5 in it and then print it, you’d write:

⎕←⍳5

Since modern APL has a REPL (read-eval-print loop), you could remove the box and the arrow today.

What Key Was That?

Wait. Where are all those keys on your keyboard? Ah, you’ve discovered the one strange thing. In 1963, CRTs were not very common. While punched cards were king, IBM also had a number of Selectric terminals. These were essentially computer-controlled typewriters that used easily replaceable type balls instead of type bars.

With the right type ball, you could have 26 upper-case letters, 10 digits, a few control characters, and then a large number of “weird” characters. But it is actually worse than that. The available symbols were still not numerous enough for APL’s appetite. So some symbols required you to type part of the symbol, press backspace, then type more of the symbol, sometimes repeating the process several times. On a printing terminal, that works fine. For the CRTs that would soon take over, this was tough to do.

For example, a comment (like a REM in Basic or a // in C++) is represented by a thumbnail (⍝). In other words, this would be an APL comment:

⍝ This is a comment

To make that character, you’d type the “arch” part, backspace, then the “dot” part. Not very speedy. Not very practical on old CRT terminals, either.

The characters aren’t the only strange thing. For example, APL evaluates math right to left.

That is, 3×2+5 is 21 because the 2+5 happens first. You just have to get used to that.

A Solution

Of course, modern screens can handle this easily and most people use an APL keyboard mapping that looks like your normal keyboard, but inserts special symbols when you use the right Alt key (with or without the shift modifier). This allows the keyboard to directly enter every possible symbol.

Of course, your keyboard’s keycaps probably don’t have those symbols etched in, so you’ll probably want a cheat sheet. You can buy APL keycaps or even entire keyboards if you really get into it.

What’s GNU With You?

While there have been many versions of APL over the years, GNU APL is certainly the easiest to set up, at least for Linux. According to the website, the project has more than 100,000 lines of C++ code! It also has many modern things like XML parsers.

A US APL keyboard layout

The real trick is making your keyboard work with the stranger characters. If you are just playing around, you can consider doing nothing. You can see the keyboard layout by issuing the ]KEYBD command at the APL prompt. That will give you something like the adjacent keyboard layout image.

From that image, you can copy and paste odd characters. That’s a pain, though. I had good luck with this command line:

setxkbmap -layout us,apl -variant ,dyalog -option grp:switch

With this setup, I can use the right alt key to get most APL characters. I never figured out how to get the shifted alternate characters, though. If you want to try harder, or if you use a different environment than I do, you might read the APL Wiki.

An Example

Rather than do a full tutorial, here’s my usual binary search high-low game. The computer asks you to think of a number, and then it guesses it. Not the best use of APL’s advanced math capabilities, but it will give you an idea of what it can do.

Here’s a survival guide. The upside-down triangle is the start or end of a function. You already know the thumbnail is a comment. A left-pointing arrow is an assignment statement. A right-pointing arrow is a goto (this was created in the 1960s; modern APL has better control structures, but they can vary between implementations).  Square boxes are for I/O, and the diamond separates multiple statements on a single line.


∇ BinarySearchGame
⍝ Initialize variables
lower ← 1
upper ← 1024
turns ← 0
cheating ← 0

⍝ Start the game
'Think of a number between 1 and 1024.' ⋄ ⎕ ← ''

Loop:
turns ← turns + 1
guess ← ⌊(lower + upper) ÷ 2 ⍝ Make a guess using binary search

⍞ ← 'Is your number ', ⍕ guess, '? (h for high, l for low, c for correct): '
response ← ⍞

→ (response = 'c')/Finish ⍝ Jump to Finish if correct
→ (response = 'h')/TooHigh ⍝ Jump to TooHigh if too high
→ (response = 'l')/TooLow ⍝ Jump to TooLow if too low
→ InvalidInput ⍝ Invalid input

TooHigh:
upper ← guess - 1
→ (lower > upper)/CheatingDetected ⍝ Detect cheating
→ Loop

TooLow:
lower ← guess + 1
→ (lower > upper)/CheatingDetected ⍝ Detect cheating
→ Loop

InvalidInput:
⍞ ← 'Invalid input. Please enter "h", "l", or "c".' ⋄ ⎕ ← ''
turns ← turns - 1 ⍝ Invalid input doesn't count as a turn
→ Loop

CheatingDetected:
⍞ ← 'Hmm... Something doesn''t add up. Did you make a mistake?' ⋄ ⎕ ← ''
cheating ← 1
→ Finish

Finish:
→ (cheating = 0)/Continue ⍝ If no cheating, continue
→ EndGame

Continue:
⍞ ← 'Great! The number is ', ⍕ guess, '. It took ', ⍕ turns, ' turns to guess it.' ⋄ ⎕ ← ''

EndGame:
⍞ ← 'Would you like to play again? (y/n): '
restart ← ⍞
→ (restart = 'y')/Restart ⍝ Restart the game if 'y'
→ Exit ⍝ Exit the game otherwise

Restart:
BinarySearchGame ⍝ Restart the game

Exit:
⍞ ← 'Thank you for playing!' ⋄ ⎕ ← '' ⍝ Exit message
∇

What’s Next?

If you want to get an idea of how APL’s special handling of data makes some programs easier, the APL Wiki has a good page for that. If you don’t want to install anything, you can run APL in your browser (although it is the Dyalog version, a very common choice for modern APL).

If you don’t want to read the documentation, check out [phoebe’s] video below. We always wanted the IBM computer that had the big switch to go from Basic to APL.

APL Keyboard image via Reddit

The First Fitbit: Engineering and Industrial Design Lessons

By: Maya Posch
9 August 2024 at 20:00

It could happen to any of us: suddenly you get this inkling of an idea for a product that you think might just be pretty useful or even cool. Some of us then go on to develop a prototype and manage to get enough seed funding to begin the long and arduous journey of turning a sloppy prototype into a sleek, mass-produced product. This is basically the story of how the Fitbit came to be, with a pretty in-depth article by [Tekla S. Perry] in IEEE Spectrum covering the development process and the countless lessons learned along the way.

Of note was that this idea for an accelerometer-based activity tracker was not new in 2006, as a range of products already existed, from 1960s mechanical pedometers to 1990s medical sensors and the shoe-based Nike+ step tracker that used Apple’s iPod with a receiver. Where this idea for the Fitbit was new was that it’d target a wide audience with a small, convenient (and affordable) device. That also set them up for a major nightmare as the two inventors were plunged into the wonderfully terrifying world of industrial design and hardware development.

One thing that helped a lot was outsourcing what they could to skilled people and having solid seed funding. That still left many hardware decisions needed to make the device as small as possible, as well as waterproof and low-power. The use of the ANT protocol instead of Bluetooth saved a lot of battery, but meant a base station was needed to connect to a PC. Making things waterproof required ultrasonic welding, but a lack of antenna testing meant that a closed case had massively reduced signal strength until a foam shim added some space. The external reset pin on the Fitbit for the base station had a low voltage on it all the time, which led to corrosion issues, and so on.

While much of this was standard development and testing fun, the real challenge was in interpreting the data from the accelerometer. After all, what does a footstep look like to an accelerometer, and when is it just a pothole while travelling by car? Developing a good algorithm here took gathering a lot of real-world data using prototype hardware, which needed tweaking when later Fitbits moved from being clipped on to being worn on the wrist. These days Fitbit is hardly the only game in town for fitness trackers, but you can definitely blame them for laying much of the groundwork for the countless options today.
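
To give a flavor of what such an algorithm has to do, here is a deliberately naive Python step counter that thresholds the acceleration magnitude and enforces a refractory period between steps. It is only a sketch of the general peak-detection idea with made-up numbers; Fitbit’s actual algorithm, tuned on all that real-world data, is far more sophisticated and was never published.

# Naive step detector: count peaks in acceleration magnitude with a refractory period.
import math

def count_steps(samples, sample_rate_hz=50, threshold_g=1.2, min_step_interval_s=0.3):
    """samples: list of (ax, ay, az) accelerometer readings in units of g."""
    steps = 0
    last_step_index = -math.inf
    min_gap = min_step_interval_s * sample_rate_hz
    for i, (ax, ay, az) in enumerate(samples):
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        # A step shows up as a spike above the ~1 g gravity baseline;
        # the refractory period keeps one footfall from counting twice.
        if magnitude > threshold_g and (i - last_step_index) >= min_gap:
            steps += 1
            last_step_index = i
    return steps

# Fake data: mostly ~1 g, with a spike every half second (25 samples at 50 Hz).
walking = [(0, 0, 1.0) if i % 25 else (0.3, 0.2, 1.5) for i in range(500)]
print(count_steps(walking), "steps detected")   # expect 20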

Polaroid in an Instant

31 July 2024 at 14:00

Edwin Land, were he alive, would hate this post. He wanted to be known for his scientific work and not for his personal life. In fact, upon his death, he ordered the destruction of all his personal papers. However, Land was, by our definition, a hacker, and while you probably correctly associate him with the Polaroid camera, that turns out to be only part of the story.

Land in 1977

It was obvious that Land was intelligent and inquisitive from an early age. At six, he blew all the fuses in the house. He was known for taking apart clocks and appliances. When his father forbade him from tearing apart a phonograph, he reportedly replied that nothing would deter him from conducting an experiment. We imagine many Hackaday readers have similar childhood stories.

Optics

He was interested in optics, and at around age 13, he started looking into using polarized light to reduce headlight glare. The problem was that one of the best polarizing crystals known — herapathite — was difficult to create in a large size. Herapathite is a crystalline form of iodoquinine sulfate studied in the 1800s by William Herapath, who was unable to grow large crystals of it. Interestingly, one of Herapath’s students noticed the crystals formed when adding iodine to urine from dogs that were given quinine.

Land spent a year at Harvard studying physics, but he left and moved to New York. He continued trying to develop a way to make large, practical, light-polarizing crystals. At night, he would sneak into labs at Columbia University to conduct experiments.

His breakthrough was the realization that he could develop tiny polarizing crystals and put millions of them in a film to form a large polarizer without the problem of growing giant crystals. At first, he created tiny crystals, suspended them in liquid, and aligned them with an electromagnet. A sheet of celluloid would pass through the liquid, picking up precisely aligned microcrystals. When the liquid dried, the crystals remained, and you had a sheet of polarizing film.

A Polarizing Patent

Two misaligned filters will pass less light until reaching 90 degrees of misalignment, which will block most light

That was the basis of the 1929 patent for polarizing films. Later, the process changed to using a polymer sheet with crystals that aligned by stretching the plastic without an electromagnet. Eventually, the crystals would be made of iodine. Not only did polarizing filters reduce glare, but using two of them allowed you to control the flow of light. If the two filters have the same alignment, light with the correct polarization will pass. As you rotate one filter, less light will pass until the polarizers are at right angles to each other. At that point, virtually no light will flow. Polariscopes can even detect stress in glass objects.
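
That falloff with rotation follows Malus’s law: once light has passed the first polarizer, the second one transmits a fraction equal to the cosine squared of the angle between them. A quick sanity check, assuming ideal, lossless filters:

# Malus's law: intensity through two ideal polarizers vs. the angle between them.
import math

def transmitted_fraction(angle_degrees):
    """Fraction of (already polarized) light passed by the second filter: cos^2(theta)."""
    return math.cos(math.radians(angle_degrees)) ** 2

for angle in (0, 30, 45, 60, 90):
    print(f"{angle:3d} degrees -> {transmitted_fraction(angle) * 100:5.1f}% of the light")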

In 1932, a Harvard professor who had family money joined with Land to form Land-Wheelwright Laboratories to manufacture polarizing films. You’d think that wouldn’t be a big business, but it turns out there were many uses for a large polarizer, although auto headlights didn’t work out. Kodak bought polarizing film for movie cameras. American Optical made polarized sunglasses. It even made 3D movies and photographs more practical.

In 1937, the company changed its name to Polaroid. But it would be 1943 before the Polaroid camera was even an idea. Of course, between those years, there was a World War to contend with.

The company sold many 3D movie cameras. They produced a 3D film for the 1939 World’s Fair. Unfortunately, the right eye film has been lost, but the left eye one is still around, and you can see it below.

War Years

Turns out polarizing films have more military uses than you might guess. Pilots and soldiers benefit from polarized goggles. A 1944 magazine article noted that all fire control teams had polarizing goggles that could adjust their darkness by turning a knob. Polaroid even produced goggles for war dogs and mules. Even General George Patton was seen sporting a pair of Polaroid goggles.

While most of the company’s war effort was optical in nature, it wasn’t all polarized light technology. For example, the company also developed synthetic quinine after the war shut off the supply of tree bark normally used to produce the medicine. While that might seem odd, at the time, quinine crystals were used in the polarizing films produced by the company. The work ultimately didn’t pan out for practical purposes, but it did win the Polaroid researcher responsible a Nobel Prize in 1965, as it was a landmark achievement in organic chemistry.

Before the war, a Polaroid employee made the Vectograph, a stereo viewer that encoded depth information in the form of polarization. During the war, the technique was used to enhance reconnaissance photos. Land and his company also played important roles in future photo intelligence development. He contributed to the U2’s camera and several satellite- and balloon-borne cameras.

The Camera

Of course, what Land is really known for is instant photography. Inspiration struck in 1943 while on vacation in Santa Fe, New Mexico. He took a picture of his three-year-old daughter. She wanted to see the resulting picture right away.

That wasn’t possible, of course, but it got Land thinking. Reportedly, in an hour, he had the basic ideas in place to make the system work. Within three years, he had a prototype. Two years after that, the camera was on sale to the public.

The camera used a technique known as diffusion transfer, which existed before Land adapted it and made it practical for cameras. Prior to this, it was used to copy documents and produce lithography plates before being replaced with more modern techniques. The company made 60 cameras and put 57 of them on a shelf in a department store, thinking they would have some time to make more. The cameras all sold in a single day, as you can see in the video below. Later, a demonstration by Steve Allen on national television undoubtedly sold many cameras.

The secret isn’t so much in the camera as in the film. In the original process, silver halide — just like regular film — turns black where the light hits it and doesn’t blacken where the image was dark. A dye transfer process migrates dye to the surface of the picture, being blocked where the image is black. This produces a positive image. This requires a series of chemical reactions.

To start the reactions, the reagents are lumped together at the edge of the picture. Rollers in the camera crush capsules containing the reagents and spread them across the picture. For color photos, there are multiple light-sensitive layers and complementary dyes.

In early cameras, the development occurred in the middle of a pack, and after a delay, the user had to separate the image from the rest of the pack. However, in 1972, integral film appeared, which used more chemical magic to develop the image right in front of your eyes.

Genius

Now, when you hear of Edwin Land, you know he did more than invent the instant camera. Not bad for someone who dropped out of school twice. He did, eventually, get an honorary PhD from Harvard. In fact, Harvard’s Baker Library has a great exhibit about Land and his work if you want a lot more detail.

If you have an instant camera, you can build your own film packs. Despite digital photography, we are still fascinated with these instant cameras.

Journey

By: EasyWithAI
6 July 2023 at 15:22
Journey lets you tell captivating stories and presentations using videos, slides, and interactive elements like calendars. The tool offers features such as automagical creation, personalized content at scale, automatic branding, and a trove of customizable blocks. It even generates a first draft using AI, ensuring you never have to start from scratch. To get started […]

Source

The Mysterious Roman Dodecahedron Was Possibly Just For Knitting

By: Maya Posch
14 July 2024 at 08:00

Over the years, archaeological digs at Roman sites have uncovered many of these strange dodecahedrons, usually made out of metal and with various holes in their faces. With no surviving records that describe how they were used, speculation has ranged from jewelry to a knitting aid. A 2023 video by [Amy Gaines] explores this latter use, employing a 3D printed dodecahedron and some wooden dowels to knit both gold wire and yarn into rather intricate patterns that are also referred to as ‘Viking knitting’.

As we mentioned previously when yet another one of these dodecahedrons was uncovered, their use was unlikely to have been of supreme relevance in military or scientific circles, on account of a lack of evidence. What is quite possible is that these were both attractive shapes for jewelry (beads) and useful knitting aids for jewelry makers (for e.g. gold wire braiding) and quite possibly yarn-related uses. The results which [Amy] demonstrates in the video for the gold wire in particular bear a striking resemblance to ancient braided gold chains on display at the Met and other museums, which lends credence to this theory.

If these items were effectively just common knitting tools, that would explain why the historical record is mum on them, as they would have been as notable as a hammer or a precision lathe used by the ancient Greeks.

Thanks to [john] for the tip.

Rulers of the Ancient World — Literally!

13 July 2024 at 08:00

If you were expecting a post about ancient kings and queens, you are probably at the wrong website. [Burn Heart] has a fascination with ancient measuring devices and set out to recreate period-correct rules, although using decidedly modern techniques.

The first example is a French rule for measuring the “pied du Roi”, or king’s foot. Apparently, his royal highness had large feet, as the French variant is nearly 13 inches long. The next rulers hail from Egypt and measure cubits and spans. Turns out the pyramid builders left a lot of information about measurements and their understanding of math and tools like dividers.

Other rules from Rome, Japan, and the Indus Valley are also included. According to the post, one set of these rulers used locally sourced wood, but a second “limited” edition used wood the originals might have been made from. Most of the rulers were etched via CNC, although the French ruler was hand-etched.

The Romans, apparently, had smaller feet than French royalty, as their Pes or foot was about 11.65 inches. There are plenty of little tidbits in the post ranging from the origin of the word inch to why the black wood used for piano keys is called ebony.
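
Using just the figures quoted in this post (plus the usual 2.54 cm per inch), a quick conversion puts the ancient feet side by side with the modern one; the exact historical values are approximate, of course.

# Rough comparison of foot-like units, using the approximate lengths quoted above.
UNITS_IN_INCHES = {
    "Modern foot": 12.0,
    "French pied du Roi": 12.79,  # "nearly 13 inches" per the post
    "Roman pes": 11.65,           # per the post
}

for name, inches in UNITS_IN_INCHES.items():
    print(f"{name:18} {inches:6.2f} in  ({inches * 2.54:5.1f} cm)")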

We’ll stipulate this isn’t exactly a hack, although it is fine workmanship and part of hacker culture is obsessing over measuring things, so we thought it was fair game. These days, rulers are often electronic. Which makes it natural to put them on a PC board.

Flintlock: The Siege of Dawn Shows Off Some of Its Bosses in the Official Story Trailer

New Zealand developer A44 Games and publisher Kepler Interactive have today revealed a new story trailer for the action RPG Flintlock: The Siege of Dawn, which launches worldwide on July 18 for Xbox Series X|S, PlayStation 5 and PC (via Steam and the Epic Games Store), as well as on Xbox Game Pass. Remember that a demo is still available on Steam.

Playing as Nor, players will work alongside a mysterious but powerful fox-like god known as Enki to seek revenge against the formidable gods who threaten to destroy the lands of Kian.

From Uru, the great winged guardian of the gate to the Great Abyss, to the many-armed Rammuha, who both despises and fears the explosive power of gunpowder, Nor will have to use her entire arsenal, including firearms, axe blows and Enki’s devastating magic, to overcome the odds and bring peace back to the world of Kian.

Pre-order it now on Steam for $39.99 USD (or the regional equivalent) in its standard edition, or for $44.99 USD (or regional pricing), both with a 10% discount on Steam until July 25.

Flintlock: The Siege of Dawn – Deluxe Edition includes the base game and 3 customization packs for Nor.

  • Flintlock: The Siege of Dawn – Noble Outfit
  • Flintlock: The Siege of Dawn – Champions Outfit
  • Flintlock: The Siege of Dawn – Vanguard Outfit

 

About Flintlock: The Siege of Dawn

From studio A44, the creators of Ashen, comes an explosive souls-lite in which gods and guns battle for the future of humanity.

The Gate to the Great Abyss has opened and, with it, the gods and their hosts of the Dead. The lands of Kian are under siege, and the city of Dawn stands on the brink of falling.

Now it falls to the Coalition army to take up arms. Stock up on vengeance, gunpowder and magic along your legendary quest to bring down the gods, close that gate and reclaim the world.

Flintlock: The Siege of Dawn takes place in a stunningly beautiful yet dangerous open world where magic and guns collide. Taking control of Nor Vanek, alongside her mystical companion Enki, players find themselves in an all-out battle against the gods and their menacing army of the dead.

They will have to hone their combat skills, using a thrilling mix of gunpowder and magic to recover the lost City of Dawn while the world around them descends into chaos.

Switch seamlessly between magic and gun combat, and team up with Enki to unleash devastating attacks against towering enemies. While the gods try to dominate everything they call home, Nor must become more than just a soldier if she is to save humanity.

Key features:

  • A unique companionship – Step into the boots of Nor Vanek, an elite member of the Coalition army, accompanied by Enki, a mysterious fox-like being, on her journey to take revenge on the gods. Guided by Enki’s knowledge of this world, your combat and exploration skills will combine with his magical powers, turning the two of you into a duo to be reckoned with.
  • An explosive souls-lite – Flintlock takes elements of the soulslike genre and infuses them with a speed that results in dynamic, explosive combat. Weave together melee combat, firearms and magic in rhythmic battles where combos chain into a lethal dance. Use Nor’s weapon skills to turn verticality to your advantage, raining death from above or quickly pulling away from danger.
  • A world wounded by the gods – Free to scourge the world, the vengeance-thirsty gods are enraged, intent on sowing chaos and bringing about humanity’s downfall. Travel across Kian’s ravaged landscape and use Enki’s magic and portals to soar through the skies while you take on the hordes of the Dead. Discover new gear, reinforce your weapons and upgrade your artifacts to prepare for the final showdown against the gods.

Minimum Requirements:

  • OS: Windows 10
  • Processor: Intel Core i5-8400 / AMD Ryzen 3 3300X
  • Memory: 8 GB RAM
  • Graphics: GTX 1060 / Radeon RX 580 (6 GB+ VRAM)
  • DirectX: Version 11
  • Storage: 30 GB available space
  • Additional Notes: SSD

Recommended Requirements:

  • OS: Windows 10
  • Processor: Intel Core i7-8700K / AMD Ryzen 5 3600X
  • Memory: 16 GB RAM
  • Graphics: RTX 2060 Super / Radeon RX 5700 (8 GB+ VRAM)
  • DirectX: Version 12
  • Storage: 30 GB available space
  • Additional Notes: SSD

The post Flintlock: The Siege of Dawn Shows Off Some of Its Bosses in the Official Story Trailer appeared first on PC Master Race Latinoamérica.

Dad? Where Did Printed Circuit Boards Come From?

6 July 2024 at 02:00

These days, it is hard to imagine electronics without printed circuit boards. They are literally in everything. Making PCBs at home used to be a chore; now you design on a computer, click a button, and boards show up in the mail. But go back far enough and there were no PC boards at all. Where did they come from? That's the question posed by [Steven Leibson], who did some investigating into the topic.

There were many false starts at building things like PCBs using wires glued to substrates or conductive inks.  However, it wasn’t until World War II that mass production of PC boards became common. In particular, they were the perfect solution for proximity fuzes in artillery shells.

The environment for these fuzes is harsh. You literally fire them out of a cannon, and they can feel up to 20,000 Gs of acceleration. That will turn most electronic circuits into mush.

The answer was to print silver-bearing ink onto a ceramic substrate. These boards carried vacuum tubes, which also needed special care. Components were often mounted vertically between two PCBs in a “cordwood” configuration.

From there, of course, things progressed rapidly. We’ve actually looked at the proximity fuze before. Not to mention cordwood.

A Look Back at the USSR’s Mi-6 Helicopter Airliner

By: Maya Posch
5 July 2024 at 20:00

Most of us would equate commercial airline travel with fixed-wing aircraft, but civilian transport by helicopter, especially in large and sparsely populated regions, is common enough. It was once even big business in the Soviet Union, where the Aeroflot airline operated passenger helicopters in regular service for many decades. In the mid-1960s they even started work on converting the Mil Mi-6 — the USSR’s largest and fastest helicopter — to carry paying passengers. Unfortunately this never got past a single prototype, with the circumstances described by [Oliver Parken] in a recent article.

This passenger version of the Mi-6 got the designation Mi-6P (for passazhirskyi, meaning passenger) and would have seated up to 80 passengers in a 3 + 2 row configuration, compared to the 28 to 31 carried by the Mi-8 passenger variant. Why exactly the Mi-6P never got past the prototype stage is unknown, but its successor, the Mi-26, does have a passenger variant, the Mi-26P, listed among its versions. Both have a cruising speed of around 250 km/h and a top speed of 300 km/h. The auxiliary winglets of the Mi-6 provided additional lift in flight, and the weight-lifting record set by the Mi-6 was only broken by the Mi-26 in 1982.

An obvious disadvantage of passenger helicopters is that they are more complicated to operate and maintain, while small fixed-wing airliners like the ATR 72 (introduced in 1988) carry about as many passengers, require just a strip of tarmac to land on and take off from, travel about twice as fast as an Mi-6P would, and do not need two helicopter pilots to fly them. Unless the ability to hover, or to land and take off vertically, is required, this pretty much explains why passenger helicopters are such a niche application. Not that the Mi-6P doesn’t have that certain je ne sais quoi to it, mind.

Elden Ring: Shadow of the Erdtree – Official Story Trailer

Publisher Bandai Namco and developer FromSoftware have released the official story trailer for the Elden Ring expansion, “Shadow of the Erdtree”.

“Winner of hundreds of awards, including Game of the Year at The Game Awards and Best Game of the Year at the Golden Joystick Awards, Elden Ring is the acclaimed epic action RPG set in a vast, dark fantasy world. Players embark on an epic quest with the freedom to explore and adventure at their own pace.

The “Shadow of the Erdtree” expansion features an all-new story set in the Land of Shadow, full of mystery, perilous dungeons, and new enemies, weapons, and equipment.

Discover uncharted territories, face formidable adversaries, and relish the satisfying triumph of victory. Immerse yourself in the fascinating interplay of characters, where drama and intrigue intertwine, creating an immersive experience to savor and enjoy.”

The Shadow of the Erdtree expansion for ELDEN RING launches on June 21 for US$39.99 for the standard edition and US$49.99 for the Premium Bundle, which includes the digital artbook and original soundtrack. Shadow of the Erdtree content requires the base game.

Pre-order now and receive the following bonus content for the expansion: an additional ELDEN RING Shadow of the Erdtree gesture, usable in-game within the Shadow of the Erdtree content. The bonus content will be available in the game when the expansion launches, and it can also be unlocked later on within the expansion.

Winner of hundreds of awards, including Game of the Year at The Game Awards and Best Game of the Year at the Golden Joystick Awards, ELDEN RING is the acclaimed action RPG set in a vast, dark fantasy world. Players embark on an epic adventure with the freedom to explore and adventure at their own pace.

About Elden Ring Shadow of the Erdtree

The Shadow of the Erdtree expansion features a new story set in the Land of Shadow, packed with mystery, perilous dungeons, and new enemies, weapons, and equipment.

Discover uncharted territories, face formidable adversaries, and enjoy the rewarding feeling of victory. Delve into the fascinating interplay between the characters, where drama and intrigue combine to create a thoroughly enjoyable, immersive experience.

“The Land of Shadow.

A place obscured by the Erdtree.

Where the goddess Marika first set foot.

A land purged in an untold battle.

Scorched by Messmer's flame.

It was to this land that Miquella departed.

Divesting himself of his flesh, his strength, his lineage.

Of all things Golden.

And now Miquella awaits

the return of his promised Lord.”


ELDEN RING Shadow of the Erdtree offers different editions for the millions of Tarnished who faced the original game, as well as for newcomers, who will need the base game to play. The following editions are now available for pre-order; unless otherwise noted below, the base game is not included with the purchase.

Base Editions

  • ELDEN RING Shadow of the Erdtree: the standard version of the expansion, available digitally on all platforms.
  • ELDEN RING Shadow of the Erdtree Edition: a special bundle that includes the ELDEN RING base game (disc) and the ELDEN RING Shadow of the Erdtree expansion (digital); physical versions are available only for PlayStation®5 and Xbox Series X|S. Available digitally on all platforms.

Premium Editions

  • Premium Bundle: includes the ELDEN RING Shadow of the Erdtree expansion and comes with a digital artbook and additional digital soundtrack content for the expansion. Available digitally on all platforms.
  • Deluxe Edition: includes the ELDEN RING base game and the ELDEN RING Shadow of the Erdtree expansion, and comes with digital artbooks and digital soundtracks for both the base game and the expansion. Available digitally on all platforms.

Collector’s Edition (limited quantities)

  • ELDEN RING Shadow of the Erdtree Collector’s Edition: includes a single coupon code for the chosen platform for the ELDEN RING Shadow of the Erdtree expansion and comes with a ~18″ (46 cm) statue of “Messmer the Impaler”, a 40-page physical artbook, and a digital soundtrack. Quantities of the ELDEN RING Shadow of the Erdtree Collector’s Edition are limited and exclusive to the Bandai Namco Entertainment store (shipping is restricted to the US and Canada).

Exclusive merchandise pre-order (limited quantities)

  • Available only from the Bandai Namco Entertainment store in very limited quantities, pre-orders are now open for the ELDEN RING Shadow of the Erdtree helmet of Messmer the Impaler. This exclusive item is a one-of-a-kind display piece made for the most demanding, distinguished, and best-dressed Tarnished. This elaborate replica of the imposing Messmer the Impaler’s helmet is crafted with precise detail and comes with a numbered certificate of authenticity. The item is available only while supplies last and will ship starting June 28 (shipping is restricted to the US and Canada). It is a collectible only and does not include any game content.

About ELDEN RING

“Dive into a thrilling adventure and decide the fate of a vast world brimming with intrigue and power. Fight formidable enemies using FromSoftware’s signature melee combat and discover the wide variety of creative strategies made possible by the boundless gameplay of this action RPG, ELDEN RING.

Hidetaka Miyazaki, creator of the prestigious and acclaimed DARK SOULS video game series, and George R. R. Martin, author of The New York Times best-selling fantasy saga A Song of Ice and Fire, bring to life a new world full of lore and fantastic tales. Players will embark on a journey through a meticulously hand-crafted world overflowing with blood and deceit, brought to life by a variety of characters with their own motivations for helping or hindering the players’ progress, adversaries with deep backstories, and fearsome creatures. Throughout their adventures, players will decide the fate of this cursed land as they unravel its secrets and myths.

In a vast, seamlessly integrated landscape with a natural progression of weather and time of day, players will immerse themselves fully in the world of ELDEN RING as they begin their journey and decide which path to follow. Travel on foot or on horseback, alone or online with friends, across grassy plains, stifling swamps, and lush forests. Climb spiraling mountains, enter breathtaking castles, and discover other majestic places on a scale never before seen in a FromSoftware title.

ELDEN RING’s role-playing and customization options let players define their own unique playstyle. Experimentation is encouraged through the wide variety of weapons, magical powers, and skills found while traversing the world, which pushes players toward previously unexplored paths of progression.

ELDEN RING gives players the opportunity to design their own route through the world. Decide whether to charge headlong into fierce battles against intimidating enemies or to take advantage of the game’s stealth and combat systems to gain the upper hand. It is up to each player to decide how to overcome the variety of challenges they face.”

Key features:

  • A vast world full of excitement – A vast world where open fields with a variety of situations and huge dungeons with complex, three-dimensional designs are seamlessly connected. As you explore, the thrill of discovering unknown and overwhelming threats awaits, delivering a great sense of accomplishment.
  • Create your own character – In addition to customizing your character’s appearance, you can freely combine the weapons, armor, and magic you equip. You can develop your character according to your playstyle, such as building up your strength to become a mighty warrior or mastering magic.
  • An epic drama born of a myth – A multilayered story told in fragments. An epic drama in which the various thoughts of the characters intersect in the Lands Between.
  • A unique online mode that connects you indirectly with others – In addition to multiplayer, where you can connect directly with other players and travel together, the game features a unique asynchronous online mode that lets you feel the presence of other players.

Elden Ring launched on PC via Steam and on PlayStation and Xbox consoles on February 25, 2022, for US$59.99 or regional pricing. A Deluxe Edition is also available for US$79.99 or regional pricing, which includes the soundtrack and artbook in digital format.

The game also has a Collector’s Edition and a Premium Collector’s Edition. The Collector’s Edition includes a 9″ statue of Malenia, a 40-page artbook, a steelbook, a digital soundtrack, and the game, and costs US$189.99. The Premium Edition adds a 1:1 replica of Malenia’s helmet and costs US$259.99.

Minimum Requirements:

  • OS: Windows 10
  • Processor: Intel Core i5-8400 or AMD Ryzen 3 3300X
  • Memory: 12 GB RAM
  • Graphics: NVIDIA GeForce GTX 1060 (3 GB) or AMD Radeon RX 580 (4 GB)
  • DirectX: Version 12
  • Storage: 60 GB available space

Recommended Requirements:

  • OS: Windows 11
  • Processor: Intel Core i7-8700K or AMD Ryzen 5 3600X
  • Memory: 16 GB RAM
  • Graphics: NVIDIA GeForce GTX 1070 (8 GB) or AMD Radeon RX Vega 64 (8 GB)
  • DirectX: Version 12
  • Storage: 60 GB available space

You can read our ELDEN RING review at this link, where we also discuss performance across several configurations. We also have an extensive gallery with more than 100 4K screenshots at maximum settings here.

Those who want to play above 60 FPS and/or have ultrawide monitors can download mods that unlock both the frame rate and 21:9 or 32:9 resolutions. All the files and instructions can be found at this link.

The post Elden Ring: Shadow of the Erdtree – Official Story Trailer appeared first on PC Master Race Latinoamérica.
