Even if it really were like 16GB on a PC, it’s still not worth $1.6k.
Especially when 16GB is something like $50.
At consumer prices. There’s no way Apple doesn’t pay wholesale rates for memory.
They have the memory controllers built into their processors now, so adding memory is even cheaper; it just takes the modules themselves.
With Apple’s new iBits™ the 0s are so much rounder and the 1s are so smooth and shiny that they’re worth at least twice as much as regular bits.
I can’t wait for my iBits. Also the fact that iBytes have ten iBits is revolutionary. 25% more computing power in each iByte!
It’s actually about the bandwidth: https://eclecticlight.co/2020/11/11/how-unified-memory-blows-the-socs-off-the-m1-macs/
The bandwidth provided by unified memory is just unparalleled because of the tightly integrated components found on Apple Silicon.
“Unparalleled”, huh? So I’m sure gamers have fully embraced Apple hardware because it’s objectively better, correct? You surely have links to benchmarks of Apple devices beating the pants off PCs… Right??
Just upgrade the RAM yourself.
Oh wait, you can’t because it’s 2023 and it’s become inexplicably acceptable to solder it to the motherboard.
It’s not “inexplicable”.
DIMM sockets introduce significant limitations to maximum bandwidth. On-package SoC RAM offers huge benefits in bandwidth and latency. Memory bandwidth on the M2 Max is 400GB/s, compared to a max of roughly 64GB/s for DDR5 DIMMs.
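If you want to sanity-check numbers like that, peak bandwidth is just transfers per second times bus width. The bus widths and speeds below are my own illustrative assumptions rather than anything off Apple’s spec sheet:

```swift
// Rough peak-bandwidth estimate: mega-transfers/sec × bus width in bytes.
// All figures here are illustrative assumptions, not official specs.
func peakBandwidthGBps(megaTransfersPerSec: Double, busWidthBits: Double) -> Double {
    megaTransfersPerSec * (busWidthBits / 8) / 1000   // MB/s → GB/s
}

// Assumed: LPDDR5-6400 on a 512-bit unified-memory bus (M2 Max class).
print(peakBandwidthGBps(megaTransfersPerSec: 6400, busWidthBits: 512))  // ~409.6 GB/s

// Assumed: DDR5-4800 DIMMs in dual channel (2 × 64-bit).
print(peakBandwidthGBps(megaTransfersPerSec: 4800, busWidthBits: 128))  // ~76.8 GB/s
```

The DIMM figure shifts with channel count and DIMM speed, which is roughly where numbers in the 60-80GB/s range come from; the wide on-package bus is what makes the difference.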
It may not be optimizing for the compute problem that you have, and that’s fine. But it’s definitely optimizing for compute problems that Apple believes to be high priority for its customers.
8GB for this price in 2023 is a SCAM. All Apple devices are a SCAM. Many pay small fortunes for luxurious devices that are full of spyware and that they have absolutely no control over. It’s insane. They like to be chained in their golden shackles.
deleted by creator
Just an example: If Apple simply wants to turn your iPhone into a brick, it can do that and there is no one who can reverse it.
Um. No they can’t. The class action lawyers would have a field day with that.
They already do so with apps.
If Apple deems the app too old, then it won’t be compatible and is as useful as a brick.
they have the power to do it, is what I’m saying
I don’t trust macOS; its proprietary code obviously hides evil spying and control functions over the user. Apple has always been an enemy of the free software community because it favors not its loyal customers but only its greedy shareholders. There is no balance; Apple has always adopted anti-competitive measures. And that’s putting it mildly.
It took EU legislation to force them to adopt a USB-C charging port. Their consumer base are their cash cows.
And even though they now have USB-C ports, it’s not even a proper USB 3 connection, as the lower-end models only support USB 2 speeds (480 Mbps max)!
USB 2.0 in 2023 LOL LOL LOL LOL
Lightning was also 480Mbps so I wonder if they just changed the port but kept most of the internals the same
They claim that the die that they use for the M1 chip doesn’t support USB 3 standards but the die that they use for the M1 Pro chip does.
Which is probably true, but they also made the chip so it’s not much of a defence.
yeah, I forgot about that, it’s a USB-C type port.
It’s not even USB 3, it’s USB 2 delivered over USB-C. Because that’s something everybody wants, isn’t it: slow transfers on a modern standard that should be faster, and indeed is faster on every other budget Android phone.
Exactly lol
Apple has always been an enemy of the free software community
Apple is one of the largest contributors to open source software in the world and they’ve been a major contributor to open source since the early 1980s. Yes, they have closed source software too… but it’s all built on an open foundation and they give a lot back to the open source community.
LLVM, for example, was a small project nobody had ever heard of in 2005, when Apple hired the university student who created it and gave him an essentially unlimited budget to hire a team. Fast forward almost two decades and it’s by far the best compiler in the world, used by both modern languages (Rust/Swift/etc) and old languages (C, JavaScript, Fortran…), and it’s still not controlled in any way by Apple. The uni student they hired was Chris Lattner; he is still president of LLVM now even though he’s moved on (currently CEO of an AI startup called Modular AI).
Well, look at the annual contribution Apple makes to the BSD projects: Apple uses a lot of open source software in its products but gives minimal financial contributions back. Even more so for a company of this size. Apple only “donates” when it is in its interest that such software is ready for it to use.
That’s too simplistic. For example, the entry level M1 MacBook Air is hands down one of the best value laptops. It’s very hard to find anything nearly as good for the price.
On the high end, yeah you can save $250-400 buying a similarly specced HP Envy or Acer Swift or something. These are totally respectable with more ports, but they have 2/3rd the battery life, worse displays, and tons of bloatware. Does that make them “not a scam”?
(I’m actually not sure what “spyware” you’re referring to, especially compared to Windows and Chromebooks.)
The bloatware really isn’t an argument, because it takes all of 30 seconds to uninstall it all with a script you get off GitHub. Yeah, it’s annoying and it shouldn’t be there, but it’s not exactly going to alter my purchase decision.
The M1 is OK value for money, but the problem is that you’ll invariably want to do more and more complex things over the lifetime of the device (if only because basic software has become more demanding), so while it might be fine at first, it tends to get in the way 4 or 5 years down the line. You can pay ever so slightly more money and future-proof your device.
But I suppose if you’re buying Apple you’re probably going to buy a new device every year anyway. Never understood the mentality personally.
My cousin gets the new iPhone every single year, and he was up for it at midnight as well. I don’t understand why, because it’s not better in any noticeable sense than it was last year; it’s got a good screen and a nice camera, but so did the model 3 years ago. Apple customers are just weird.
I think you’re basing your general estimation of the Apple customer on the iPhone customer a bit too heavily. E.g., I have never had an iPhone and wouldn’t ever consider buying one, considering how locked down and overpriced it is, and how competitive Android is as an alternative OS.
Meanwhile, I’ve been on MacOS for something like 7 or so years and cannot look back, for everyday computing needs. I have to use Windows occasionally on work machines and I cannot emphasise enough how much of an absolute chore it is. Endless errors, inconsistent UX, slow (even on good hardware), etc. It is by contrast just a painful experience at this point.
And one of the reasons people buy MacBooks, myself included, is to have longevity, not to refresh it after a year (that’s insane). It’s a false economy buying a Windows laptop for most people, because you absolutely do need to upgrade sooner rather than later. My partner has a MacBook bought in 2014 and it still handles everyday tasks very well.
It’s a false economy buying a Windows laptop for most people, because you absolutely do need to upgrade sooner rather than later.
I think you missed my point.
You want to keep laptops for ages regardless of what OS they run (really not sure how that would have any bearing on spec fall-off), but the M1 MacBook is only competitive now; it won’t be competitive in 4 to 5 years. The chip is good for its power consumption, but it isn’t a particularly high-performance chip in terms of raw numbers. Yet the laptop is priced as if it were.
There’s no such thing as a “Windows laptop”; you just buy a laptop and those are the specs you get, so I’m not quite sure what you’re comparing the MacBook to.
I’m not referring to Windows or ChromeOS (which are full of spyware too). The first generation of M1 Macs had a somewhat more “accessible” price precisely to encourage users to migrate to ARM and consequently encourage developers to port their software, not because Apple was being generous. Far from it. Everything Apple does, in the short or long term, is to benefit itself.
Not to mention that it is known that Apple limits both hardware and software on its products to force consumers to pay the “Apple Idiot Tax”. There is no freedom whatsoever in these products, true gilded cages. Thank you, but I don’t need it. Software and hardware freedom are more important.
I didn’t claim that Apple is doing anything to be “generous”. That seems like it’s moving the goal posts. Say, are other PC manufacturers doing things out of generosity? Which ones?
Even the M2 and M3 Macs are a good value if you want the things they’re good at. For just a few hundred more, no other machine has the thermal management or battery life. Very few have the same build quality or displays. If you’re using it for real professional work, even just hours of typing and reading, paying a few extra hundred over the course of years for these features is hardly a “scam”.
You didn’t elaborate on your “spyware” claim. Was that a lie? And now you claim it’s “known” that Apple limits hardware and software. Can you elaborate?
MacBooks do have excellent screens, software integration and everything else, that’s a fact and I don’t take that away from Apple. But the problem is that it’s not worth paying for this in exchange for a system that is completely linked to your Apple ID, tracking all your behavior for advertising purposes and whatever else Apple decides. Privacy and freedom are worth more. If you can’t check the source code you can’t trust what Apple says, they can lie for their own interests. Have you ever read Apple’s privacy policy regarding Apple ID, for example? If not, I recommend it.
I think that decision makes sense.
What you said got me worried, so I looked into the claim that it is “tracking all your behavior for advertising purposes and whatever else Apple decides”. I don’t see any evidence that they’re doing anything close to that level of tracking (the main thing they seem to track is your Mac App Store usage), but they may have the potential to do so in some enshittified future. That gives me pause.
I bought a PC the other day and it only had 6 gigabytes of RAM which is pathetic for what I paid for it but there you go. The thing is for a fraction of the price Apple are asking to upgrade it to 16, I upgraded it to 32 gig.
I honestly think I could upgrade it to 64 and still come in under the Apple price. They’re charging something like a 300% markup on commercially available RAM, it’s ridiculous.
On storage, the markup is about 2000%.
And on RAM if we compare to DDR5 (not totally fair because of how Apple’s unified memory works), it’s about 800% marked up.
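For what it’s worth, here’s the back-of-envelope version of that markup maths. Every price in it is an illustrative assumption (not a quote from Apple or any retailer), and the result swings a lot depending on which prices you plug in, which is presumably why estimates range from ~800% to ~2000%:

```swift
// Back-of-envelope markup estimate. All prices are illustrative assumptions.
func markupPercent(upgradePrice: Double, partsPrice: Double) -> Double {
    (upgradePrice - partsPrice) / partsPrice * 100
}

// Assumed: ~$200 for an 8GB RAM bump vs ~$25 for a retail 8GB DDR5 stick.
print(markupPercent(upgradePrice: 200, partsPrice: 25))   // ~700%

// Assumed: ~$200 for a 256GB storage bump vs ~$15 of commodity NVMe flash.
print(markupPercent(upgradePrice: 200, partsPrice: 15))   // ~1200%
```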
Pairing a chip this capable with just 8GB of shared memory is also just a waste of good silicon. Which makes the price all the more insulting to me.
Like, this is the equivalent of Usain Bolt losing one of his legs
“His one leg is still more capable than regular person’s two legs”
That is exactly what Apple would say, isn’t it
The thing is even if that were true, which it isn’t, I’d still prefer him with two legs. Especially if I’m paying the amount of money I would normally pay for 50 legs.
Somewhat stretching the analogy there
They could just sell appropriately specced computers and make absolute bank like they do anyway, but nooo, that would be too nice of them.
I don’t necessarily even begrudge them a profit margin on RAM. Sure it’s kind of a scam but also I guess it’s just the price you pay for convenience. If you want to better price you upgrade the RAM yourself (assuming that was actually possible).
But the markup they have on RAM isn’t reasonable it’s totally insane.
If you went to McDonald’s and a cheeseburger was $0.99 and a cheeseburger with extra cheese was $2, you’d think something was up, but that’s essentially what Apple is doing. The extra cheese does not cost a dollar; you’re literally almost doubling the price for a subcomponent.
Apple fans have a very different definition of the word convenience than I do, then.
It’s so annoying. They have the whole design industry by their balls with their great displays and perfect colour management in MacOS.
Putting more RAM in those models, or just cutting the lower-end models out entirely would do them no harm at all.
It’s always nice to have as many legs as possible. I love legs
Somewhat stretching the analogy there
Your analogy is looking a bit leggy at this point.
Seems fair, you pay 1000 for the logo and 600 for the hardware.
It’s a very nice logo. And it lights up. Hard to argue with their pricing, really.
It actually doesn’t light up anymore…
For $375 you can get an iFlashlight to point at the logo
Ordered the iFleshlight. Looking forward to seeing the jealous looks I get at the coffee shop.
It’s actually just the display backlight, which is why I had to cover it with aluminium tape instead of just disconnecting the wire. Not only do I not want an ad on my computer, I especially don’t want an illuminated one.
Apple laptops and gaming headphones, keeping it classy
If anything I feel it’s the opposite, because that memory is shared with the GPU. So if you’re gaming, even with some old game, it’s like having 4GB for the system and 4GB for the GPU. They might claim that their scheduler is magic and can predict memory usage with perfect accuracy, but even then it would be something like 6+2 GB. If a game has heavy textures it will steal memory from the system. Maybe you want a browser open for watching a tutorial on YouTube while gaming, or a chat; that’s another 1-2 GB stolen from the CPU and GPU.
Their pricing for the RAM is ridiculous; they’re charging $300 for just 8GB of additional memory! We’re not in the 2010s anymore!
The most expensive 8GB DDR5 stick I can find on Amazon is about us$35. There are 64GB sets that are under us$200!
Apple should be ashamed.
Maybe you want to have a browser for watching a tutorial on YouTube during gaming, or a chat. That’s another 1-2 gb stolen from the CPU and GPU.
Or five times that amount if you’re running Chrome
deleted by creator
a toy for professional workloads
[rant]
I think this is one of those words which has lost its meaning in the personal computer world. What are people doing with computers these days? Every single technology reviewer is, well, a reviewer - a journalist. The heaviest workload that computer will ever see is Photoshop, and 98% of the time will be spent in word processing at 200 words per minute or on a web browser. A mid-level phone from 2016 can do pretty much all of that work without skipping a beat. That’s “professional” work these days.
The heavy loads Macs are benchmarked to lift are usually video processing. Which, don’t get me wrong, is compute intensive - but modern CPU designers have recognized that they can’t lift that load in general purpose registers, so all modern chips have secondary pipelines which are essentially embedded ASICs optimized for very specific tasks. Video codecs are now, effectively, hardcoded onto the chips. Phone chips running at <3W TDP are encoding 8K60 in realtime and the cheapest i series Intel x64 chips are transcoding a dozen 4K60 streams while the main CPU is idle 80% of the time.
Yes, I get bent out of shape a bit over the “professional” workload claims because I work in an engineering field. I run finite element models and, while sparse matrix solutions have gotten faster over the years, it’s still a CPU-intensive process, and general (non-video) matrix operations aren’t really gaining all that much speed. Worse, I work in an industry with large, complex 2D files (PDFs with hundreds of 100MP images and overlain vector graphics), and the speed of rendering hasn’t appreciably changed in several years because there’s no pipeline optimization for it. People out there doing CFD and technical 3D modeling, as well as other general compute-intensive tasks on what we used to call “workstations”, are the professional applications which need real computational speed - and they’re/we’re just getting clock-speed improvements plus parallel speedups that scale roughly with the square root of the number of cores, when the software can even parallelize at all (rough sketch of those diminishing returns below). All these manufacturers can miss me with the “professional” workloads of people surfing the web and doing word processing.
[/rant]
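On the parallelization point in the rant above: here’s a minimal Amdahl’s-law sketch of why throwing cores at that kind of work flattens out. Amdahl’s law is just the generic model, not a claim about any particular solver, and the 80% parallel fraction is an assumed number purely for illustration:

```swift
import Foundation

// Amdahl's law: if a fraction p of the work parallelizes perfectly,
// the speedup on n cores is 1 / ((1 - p) + p / n).
func amdahlSpeedup(parallelFraction p: Double, cores n: Double) -> Double {
    1.0 / ((1.0 - p) + p / n)
}

// Assumed for illustration: 80% of a sparse-matrix solve parallelizes cleanly.
for cores in [2.0, 4.0, 8.0, 16.0, 64.0] {
    let s = amdahlSpeedup(parallelFraction: 0.8, cores: cores)
    print(String(format: "%2.0f cores: %.1fx", cores, s))
}
// Even with unlimited cores the speedup is capped at 1 / (1 - 0.8) = 5x,
// which is why "more cores" stops helping long before "faster cores" does.
```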
deleted by creator
Indeed! It makes the benchmarks that much more disingenuous since pros will end up CPU crunching. I find video production tedious (it’s a skill issue/PEBKAC, really) so I usually just let the GPU (nvenc) do it to save time. ;-)
deleted by creator
The Mac Pro is a terrible deal even compared to their own Mac Studio. It has the same specs but costs almost $1000 extra. Yes, the cheese-grater aluminum case is cool, but $1000 cool?
Also, one of these days AMD or Intel will bolt 8GB on their CPUs too, and then they’ll squash M.
I can’t remember who it is, but somebody is already doing this. It’s primarily marketed as an AI training chip though, so basically only Microsoft and Google are able to buy them; even if you had the money, there isn’t any stock left.
Apple exec doesn’t actually understand how computers work and thinks that might actually be a reasonable argument.
It doesn’t matter how good your processor is: if you can only fit 8 GB of stuff into memory, it’s going to be slow. The only way an 8 GB device would beat a 16 GB device would be if the 16 GB device had the world’s slowest processor. Like something from 2005. Paging stuff out of RAM is the single slowest operation you can perform other than loading from a hard drive.
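To put very rough numbers on that, here are generic ballpark latencies. These are order-of-magnitude assumptions, not measurements of any particular machine:

```swift
// Rough, order-of-magnitude access latencies in nanoseconds (ballpark assumptions).
let dramAccessNs: Double = 100          // main-memory access
let nvmeReadNs:   Double = 100_000      // SSD random read, i.e. hitting swap
let hddSeekNs:    Double = 5_000_000    // spinning-disk seek

print("Hitting swap is ~\(Int(nvmeReadNs / dramAccessNs))x slower than RAM")   // ~1000x
print("A hard drive is ~\(Int(hddSeekNs / dramAccessNs))x slower than RAM")    // ~50000x
```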
Apple exec doesn’t actually understand how computers work and thinks that might actually be a reasonable argument
I think a lot of Apple users fit this bill too, so it doesn’t matter much if this is the messaging; a fair number of people will believe it.
Do they store 32-bit integers as 16-bit internally or how does macOS magically only use half the RAM? Hint: it doesn’t.
Even if macOS were more lightweight than Windows - which might well be true with all the BS processes running in Windows 11 especially - third-party multiplatform apps will use similar amounts of memory no matter the platform they run on. Even for simple use cases, 8 GB is at the limit (though it’ll likely still be fine), as Electron apps tend to eat RAM for breakfast. Love Apple or hate it, people often (need to) use memory-hogging apps like Teams or even Spotify, and they are not native Swift apps.
I love my M1 Max MacBook Pro, but fuck right off with that bullshit, it’s straight up lying.
Do they store 32-bit integers as 16-bit internally or how does macOS magically only use half the RAM? Hint: it doesn’t.
As a Mac programmer I can give you a real answer… there are three major differences… but before I go into those, almost all integers in native Mac apps are 64 bit. 128 bit is probably more common than 32.
First of all, Mac software generally doesn’t use garbage collection. It uses “Automatic Reference Counting”, which is far more efficient. Back when computers had kilobytes of RAM, reference counting was the standard, with programmers painstakingly writing code to clear things from memory the moment they weren’t needed anymore. The automatic version of that is the same, except the compiler writes the code for you… and it tends to do an even better job than a human, since it doesn’t get sloppy.
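For anyone who hasn’t seen it, here’s a tiny toy sketch of what that deterministic behaviour looks like in Swift (the class and names are made up for illustration, not from any real app):

```swift
// Toy example of Automatic Reference Counting (ARC).
// The compiler inserts the retain/release calls; deinit runs the instant the
// last strong reference goes away, not at some later garbage-collection pause.
final class ImageBuffer {
    let name: String
    init(name: String) {
        self.name = name
        print("\(name) allocated")
    }
    deinit {
        print("\(name) freed")   // deterministic: runs as soon as the refcount hits zero
    }
}

func processFiles() {
    for i in 1...3 {
        let buffer = ImageBuffer(name: "file\(i)")
        _ = buffer   // ... pretend we do some work with it ...
        // `buffer` is released right here at the end of each iteration,
        // so only one file's worth of memory is ever live at a time.
    }
}

processFiles()
```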
Garbage collection, the norm in modern Windows and Linux code, frankly sucks. Code that, for example, reads a bunch of files on disk might store all of those files in RAM for ten seconds even if it only needs one of them in RAM at a time. That can burn 20GB of memory and push all of your other apps out into swap. Yuck.
Second, swap, while it’s used less (due to reference counting), still isn’t a “last resort” on Macs. Rather it’s a best practice to use swap deliberately for memory that you know doesn’t need to be super fast. A toolbar icon for example… you map the file into swap and then allow the kernel to decide if it should be copied into RAM or not. Chances are the toolbar doesn’t change for minutes at a time or it might not even be visible on the screen at all - so even if you have several gigabytes of RAM available there’s a good chance the kernel will kick that icon out of RAM.
And before you say “toolbar icons are tiny” - they’re not really. The tiny favicon for beehaw is 49kb as a compressed png… but to draw it quickly you might store it uncompressed in RAM. It’s 192px square and 32 bit color so 192 x 192 x 32 = 1.1MB of RAM for just one favicon. Multiply that by enough browser tabs and… Ouch. Which is why Mac software would commonly have the favicon as a png on disk, map the file into swap, and decompress the png every time it needs to be drawn (the window manager will keep a cache of the window in GPU memory anyway, so it won’t be redrawn often).
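If you’re curious what “map the file and let the kernel decide” looks like in code, Foundation’s Data can memory-map a file instead of eagerly copying it into RAM. The file path here is made up purely for illustration:

```swift
import Foundation

// Memory-map an icon file rather than reading it into RAM up front.
// Pages are only faulted in when actually touched, and the kernel is free to
// drop them again under memory pressure, since they can be re-read from disk.
let iconURL = URL(fileURLWithPath: "/tmp/favicon.png")   // hypothetical path
if let mapped = try? Data(contentsOf: iconURL, options: .mappedIfSafe) {
    print("mapped \(mapped.count) bytes without copying them all into RAM")
}
```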
Third, modern Macs have really fast flash memory for swap. So fast it’s hard to actually measure it, talking single digit microseconds, which means you can read several thousand files off disk in the time it takes the LCD to refresh. If an app needs to read a hundred images off swap in order to draw to the screen… the user is not going to notice. It will be just as fast as if those images were in RAM.
Sure, we all run a few apps that are poorly written - e.g. Microsoft Teams - but that doesn’t matter if all your other software is efficient. Teams uses, what, 2GB? There will be plenty left for everything else.
Of course, some people need more than 8GB. But Apple does sell laptops with up to 128GB of RAM for those users.
Almost all programs use both 32bit and 64bit integers, sometimes even smaller ones, if possible. Being memory efficient is critical for performance, as L1 caches are still very small.
Garbage collection is a feature of programming languages, not an OS. Almost all native linux software is written in systems programming languages like C, Rust or C++, none of which have a garbage collector.
Swap is used the same way on both Linux and Windows, but kicking toolbar items out of RAM is not actually a thing. It needs to be drawn to the screen every frame, so it (or a pixel buffer for the entire toolbar) will kick around in VRAM at the very least. A transfer from disk to VRAM can take hundreds of milliseconds, which would limit you to like 5 fps; no one re-transfers images like that every frame.
Also your icon is 1.1Mbit not 1.1MB
I have a Gentoo install that uses 50MB of RAM for everything including its GUI. A web browser will still eat up gigabytes of RAM; the OS has literally no say in this.
My 32/16-bit integer example was just that: an example where one was half the size of the other. Take 128/64 or whatever; it doesn’t matter, as it doesn’t work like that (which was my point).
Software written in non-GC based languages runs on other operating systems as well.
I used MS Teams as an example, but it’s hardly an exception when it comes to Electron/WebView/CEF apps. You have Spotify running, maybe a password manager (even 1Password uses Electron for its GUI nowadays), and don’t forget about all the web apps you have open in the browser, like maybe GMail and some Google Docs spreadsheet.
And sure, Macs have fast flash memory, but so do PC notebooks in this price range. A 990 Pro also doesn’t set you back $400 per terabyte, but more like … $80, if even that. A fifth. Not sure where you got that they are so fast it’s hard to measure.
There are tests out there that clearly show why 8 GB are a complete joke on a $1600 machine.
So no, I still don’t buy it. I use a desktop Windows/Linux machine and a MacBook Pro (M1 Max) and the same workflows tend to use very similar amounts of memory (what a surprise /s).
I felt ripped off just reading the article. My recent PC build has 32 GB, is cheaper, and the upgrade to 64 GB (meaning an additional pair of 16 GB sticks) only costs me around 100 euros. It’s nice that their devices are probably more efficient and need less RAM, as the iPhones have shown. But that does not mean the additional RAM actually costs more to make. Apple chose to make it expensive.
16GB OptiPlexes are on sale for 85 dollars on eBay. Don’t come with Windows, but neither do Macs :P
Tell that to Google Chrome
I looked at a few Lenovo and MS laptops to see what they are charging to jump from 8 to 16 GB.
They are very close to what Apple charges.
So, they are ALL ripping us off!

I just got a laptop with 64GB of DDR5 RAM for $870 or so from HP, so I wouldn’t take these specific examples you found as gospel.
I switched back to Apple recently, but used to sell them.
A week before Boot Camp was released, I was selling Apple gear, and I showed a visiting sales manager how we got Windows running on the new Intel Mac mini, and explained why this was a great transition technology.
In front of customers, as I was explaining, he basically called me an idiot and said “why would anyone want to run Windows on a Mac”.
A week or so later, Boot Camp was released, and he was back… He was now using the arguments I’d made a week earlier as a template for bragging about Boot Camp to us and explaining the benefits. No apology for any of the previous discussion.
They make decent products otherwise, and management doesn’t even need to act like wankers or be deceptive either
I’m only using Apple again now because Microsoft finally pushed me over the edge with Windows (literally, when they started hijacking my Chrome tabs on EVERY bootup and opening Edge automatically), and because my Xbox Series X wouldn’t even do remote play on Windows (their own OS).
deleted by creator
Absolutely agree. Unfortunately, Apple also attracts the kind of idiots who think they know what they’re talking about. When I was selling them, I had a customer tell another that Macs can’t get viruses while I was talking to them.
I used a lot of Linux in the past too (everything from playing Unreal Tournament on Gentoo in the 3dfx days to Ubuntu more recently), and unfortunately Linux has also tended to attract the stuck-up crowd.
But, slowly, the Linux culture does seem to be changing. We still regularly see people argue about things like systemd vs. init scripts (and anyone who has ever written an init script knows exactly what a pile of crap they are to write) and PulseAudio vs. ALSA/OSS/ESounD/etc. (whereas any old-school user also remembers the pain of sound servers conflicting with each other). Linux does finally appear to be on the right path to improving things and interoperability, and the general common-sense crowd finally seems to be drowning the others out (new technologies like Wayland or PipeWire are no longer getting heavy blowback). It may also be because Linux developers these days tend to be a lot better at communicating the benefits (Compiz was another case where the benefits were well communicated).
Honestly, there are a lot of things Apple should be fixing.
Why did you decide to go back to Apple instead of giving Linux a try? It’s free so it literally would have cost nothing to try, and you could keep your other OS(s).
I used to use Linux exclusively (I was actually the top poster on a few major Linux news sites, and my Linux project once got published in LinuxWorld Magazine).
Whilst it has certainly gotten better, I still feel some parts of Linux need refining. Also, one thing both Microsoft and Apple do have is integration of mobile apps… I thought I could get both via Parallels (Android in Windows, iPhone apps in macOS), but it turns out Android in Windows under Parallels won’t work.
For the type of development I do, Windows and macOS are unfortunately still the best options. If Linux had more seamless mobile app integration, I probably would have seriously considered it, to be honest.
By mobile app integration, do you mean a connection between your mobile phone and your computer? KDE Connect is pretty good from my experience. It has more features than the Windows alternative at least (and I think there’s even a Windows version oddly enough).
If you mean running a mobile app in the system, I have no experience with that.
Running mobile apps on the computer. It’s really the one use case Apple does extremely well, and it’s a pity, because Linux could actually do it well too if distros sorted themselves out.