The oddity in that 5M figure is that the US is generally patriotic about US products [look at Apple], yet I could never understand WHY Commodore didn't make it big. They were in such a good position...
You make a lot of good points
@Karlosjackel. First, that whole patriotic thing... I am looking at Apple, and I'm wondering what happened here with Commodore Amiga. I mean, we have this Toaster, and it's a hell of a solution. With the Flyer it's already basically digital back in 1993, except for the speed of video transfer that DV/FireWire later gave FinalCut/iMovie, which is the difference maker for usability. Consumer cameras going digital (DV/FireWire) helped video editing become the killer app consumers wanted. What is a G4/G5/iMac but a Toaster setup in Pro/Consumer form? That's what many users were buying Apples for. Education, marketing houses, content creators - suddenly it became a must-have to do video. It became THE killer app for Apple.

Toaster/LightWave was that exact same thing on the Commodore years earlier, yet as you note and as we know, it didn't make it big and Commodore failed. So... my point: was it all just too early for the market? Was the market not ready for it yet? Did the consumer not yet realize this was what they wanted and needed? Did Apple just ride the Amiga's coattails by repackaging this for broader consumption?
So, piracy saved the Amiga... in the end.
So maybe it did for a while! Software/hardware is always a balance. Again, look at Apple: they had the corporate/creative customers who were locked by compliance into buying expensive apps like Photoshop. Initially they are clearly a hardware company; then they take a lot of content-creation apps in-house and start to control the software side. Now they are very much a software company, and their control of the software allows them to sell the hardware. Certainly the goal is to balance revenue between the two. But they understand that the killer app is needed - people buy the hardware for it - boom! The trick is to balance, cover costs, and present value to the consumer. Apple certainly does that.
Personally, I'm reflecting on the Amiga even today...like many of us.
@bdb. It is a very impactful computer historically. I think even today it is making a significant impact on the computing community and getting no revenue reward - because it doesn't exist. (Who was first to FPGA cores?) Look at how strongly the Amiga community feels about it even today. Oh sure, there were only ~5M originally sold, and many of us bought the computer twice... I had the 500, then I had the 2000.

As a sidebar, I looked at the 1200 and 4000 at the time and just couldn't justify them after having the 2000 - even the amazing tower systems. At that point my 2000 had a GVP 040, a Retina, an A2320 flicker fixer, an A2091, MegaChip, a 286 Bridge... what exactly could a 1200 do that mine couldn't? AGA games? The 4000 was nice, but what was I going to do... save a second or two on a transfer or render? So that was certainly an issue for upgrading. I wasn't looking for a 2000, but a 2500 in need of serious cosmetic TLC with a spotless 6.2 just landed in my lap, and looking at it closely today... well, the 2000 is a heck of a machine. The video slot doesn't take up a Zorro. It has 5 Zorro slots like the towers, plus ISA slots and a CPU slot - it's a really smart machine. In a bunch of ways the 3000/4000 went backwards, and only the tower versions of those really offer what the 2000 offered.

And recently I got into the DE-10 Nano and the FPGA thing - when it comes to retro, wasn't the Amiga core the driver of this hardware-accurate FPGA retro concept that has become such a killer experience today? Even as far as the Vampire. But back to reflecting on the state of compute today.
The Amiga is about to turn 40 next year - where is compute? Is it better? Every damn time I slide an 880KB floppy into an Amiga and boot up a multitasking operating system, I think to myself: 'How did this crap bloat up to the 20GB needed for Windows 11?'
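Back-of-the-envelope, just to put rough numbers on that: 20GB is about 20,971,520KB, and 20,971,520 / 880 comes out around 23,800 - call it roughly 24,000 Amiga floppies' worth of operating system, where one 880KB disk (plus Kickstart in ROM) used to get you a booted multitasking desktop.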
I am really starting to think that the tech industry is just creating bloatware to make us need new CPUs, GPUs, storage, etc. - software becoming the pusher of new hardware. I just started using an old iPhone 4s for beach music, and each time I look at it I reflect on how it isn't really that big a step back from whatever is out there today. In fact, it's a better, more focused product in many ways.
With chatbots really starting around 2016, and now all this crypto, metaverse, and LLM/AI crap, it all feels like a push to use more compute and sell us new hardware. But is it really delivering on a need? Do we really need it? It's all trying to be the killer solution, but is it really? And then there's the whole Dead Internet Theory... I really think there is truth to it. These large companies have so much revenue riding on these stupid ads that it would make sense for them to develop tools that imitate humans and generate more ad impressions, so they can report traffic/views and collect from the companies who advertise with them. Tech is no longer tech - it's infomercials, because when your revenue stream comes from pushing ads, what are you? Is this what we want?
Could we perhaps eventually go back to the simpler, more efficient, less intrusive ways of the Amiga days - no personal data collection, no advertising? Is it already happening, with social media and site traffic shrinking?