Interesting. Samsung making a bold move here, but one that could make sense.
Samsung: We make fun of Apple until we copy them outright.
See also: removing ports, having a notch
Let Apple take the flak for moving the market and then quietly copy, because of course it’s more lucrative… classic.
It’s only “moving the market” because of cargo-cult behavior from other manufacturers. Samsung especially.
Apple was like the third phone maker with a notch. The first-notch claim actually belongs to Essential.
And Motorola had true wireless earbuds earlier, etc.
Apple is about polish, not novelty, but a ton of people are obsessed with the idea of Apple as being “groundbreakers” everywhere.
Apple’s focus on polish makes them a not-first-mover. The Apple Vision Pro isn’t the first virtual or augmented reality device. The AirPods Max aren’t the first pair of wireless headphones.
This device has a notch?
Also, when did Samsung attack Apple for using M1/M2/M3?
Samsung Galaxy Tab S8/9 Ultra. Samsung publicly made fun of Apple’s notch, then released their tablets with a notch a few months later. (Although, to be fair, nowhere near as pronounced as Apple’s, and mostly justified by the extremely thin bezels.)
ARM is great on Linux, where almost everything has an ARM version, and Apple can simply mandate that everyone supports it, but where are you going to find Windows programs compiled for ARM?
The only reason Windows is still relevant is a massive volume of legacy x86 applications.
If that laptop won’t support x86 emulation, it’d actually be worse than a Linux ARM laptop.
That’s one thing macOS does well: legacy support— at least for x64.
for now…
I have been running Windows 10 and 11 on ARM for years now, and Windows Server 2025 already has an ARM preview release. Windows on ARM has had x86 emulation for a long time, and has supported x64 emulation since about the start of COVID.
Is it actually emulation? Macs don’t do that.
They convert the x86 code into native ARM code, then execute it. Recompiling the software takes a moment, and some CPU instructions don’t have a good equivalent, but for the most part it works very well.
macOS uses the term “translation” for its Rosetta layer, while Windows on ARM uses the term “emulation”. I believe the technical difference is that macOS converts x64 code to arm64 on the fly, while part of the reason for emulation on Windows is to support x86 and other architectures. Someone more knowledgeable than me may be able to better compare the two offerings.
macOS converts x86 code to ARM ahead of launching an app, and then caches the translation. It adds a tiny delay to the first time you launch an x86 app on ARM. It also does on-the-fly translation if needed, for applications that do code generation at runtime (such as scripting languages with JIT compilers).
The biggest difference is that Apple has added support for an x86-like strong memory model to their ARM chips. ARM has a weak memory model. Translating code written for a strong memory model to run on a CPU with a weak memory model absolutely kills performance (see my other comment above for details).
They did a good job when moving from OS 9 to OS X. Adobe took a looong time to move to OS X, though.
I thought macOS barely does any legacy support, because Apple isn’t afraid to cut support for old stuff, unlike Microsoft.
Windows is relevant because it’s a better product for the average user. The same goes for OSX. ARM isn’t going to change any of that. Especially with NVIDIA GPUs being broken and a pain in the ass.
Windows is not a ‘better’ product, that would be ChromeOS. Zero configuration means nothing can get broken.
The average user who started with MS Office 95 is now 50 years old. The younger average user at least knows there are alternatives to Windows.
PC gaming is a whole other can of worms. I keep hearing that Valve did some black magic and now most Steam games work on Linux with no issues.
I’ve been gaming on Linux for about two years now through Steam’s Proton and it’s really good. Some games don’t run because of anti-cheat; some games run even better than on Windows.
Gaming on Linux has come a long way and I always prefer to run it on Linux rather than a dedicated Windows boot, if possible.
But if you rely on VRR or DLSS, or have a decent HDR display, Linux unfortunately still isn’t quite there yet. VRR/HDR is mostly unsupported system-wide currently. DLSS sometimes works, sometimes requires a lot of debugging, and sometimes ends up actually hurting performance.
If your hardware setup allows you to run your games at a decent framerate without DLSS/VRR, this likely won’t be an issue for you.
Google Docs is the only meaningful competitor to Office. No one I know wants to try Linux desktop and I think it’s hard to convince anyone to give up the convenience of Windows. Proton works but in my experience requires too much experimentation for the average user.
Any program written for the .net clr ought to just run out of the box. There’s also an x64 to ARM translation layer that works much like Apple’s Rosetta. It will run the binary through a translation and execute that. I have one of the windows arm dev units. It works relatively well except on some games from my limited experience.
Any program written for the .net clr ought to just run out of the box.
Both of them?
There’s also an x64 to ARM translation layer that works much like Apple’s Rosetta.
Except for the performance bit.
ARM processors use a weak memory model, whereas x86 uses a strong memory model. That means x86 guarantees the actual order of writes to memory is the same as the order in which those writes execute, while ARM is allowed to reorder them.
Usually it doesn’t matter in which order data is written to RAM, and allowing reordering of writes can boost performance. When it does matter, a developer can insert a so-called memory barrier, which ensures all writes before the barrier are finished before the code continues.
However, since this is unnecessary on x86, where all writes are ordered, x86 code does not include these memory barrier instructions at the spots where write order actually matters. So when translating x86 code to ARM code, you have to assume write order always matters, because you can’t tell the difference. That means inserting memory barriers after every write in the translated code, which absolutely kills performance.
Apple includes a special mode in their ARM chips, only used by Rosetta, that enables an x86-like strong memory model. This means Rosetta can translate x86 to ARM without inserting those performance-killing memory barriers. Unless Qualcomm added a similar mode (and AFAIK they did not) and Microsoft added support for it in their emulator, performance of translated x86 code is going to be nothing like that of Rosetta.
The biggest advantage Apple has is they’ve been breaking legacy compatibility every couple of years, training devs to write more portable code and setting a consumer expectation of change. I can’t imagine how the emulator will cope with 32-bit software written for the Pentium II.
Qualcomm has a pretty fast emulator for the growing pains. Microsoft offers ARM versions of most of their software.
And many open source projects could be cross-compiled; it wouldn’t take long if these things start selling.
Disagree. I run an M1 MacBook and enjoy it mostly because everything is compiled for ARM. The very few apps running through Rosetta are slow to launch, drain the battery, and are less performant. If you were to run mostly x64 on ARM, it would kill the point of ARM: battery life becomes just as bad as on x64, and performance is worse.
Disagree with your disagreement. I also have an M1 and was a quite early adopter (within 3 months of launch). It was really snappy compared to my Intel Air it replaced. From the get-go. Even for apps that were still x86 code.
Things definitely improved over the next 9 months, but I was and am a really happy camper.
Intel Air doesn’t count. Those were dogshit processors
Well, decent processors, just laughably bad cooling design.
I didn’t run macOS before that, but I’m surprised it could be worse: when using an app through Rosetta it takes at least 2 or 3 seconds to launch, and there is 20-30% more CPU usage. That’s from my very limited pool of apps (2), but even then you’d have been crazy to run macOS before; it would have been more expensive than a Windows laptop for way less performance.
Qualcomm has a pretty fast emulator for the growing pains.
With how much, 10% or 20% performance loss? Better buy x86 then.
I don’t know what these chips are like, but x86 software runs perfectly on my ARM Mac. And not just small apps either, I’m running full x86 Linux servers in virtual machines.
There is a measurable performance penalty but it’s not a noticeable one with any of the software I use… ultimately it just means the CPU sits on 0.2 load instead of 0.1 load and if it spikes to 100% then it’s only for a split second.
I recently bought an M1 Max and I definitely regret migrating data from my Intel MacBook. I’ve had to reinstall nearly all the apps anyway. Less compatible than I was expecting. Overall happy with it.
MS has been working on ARM for years. To think otherwise is naive.
Oh yeah but has anyone else?
Linux has been fully working on ARM for much longer than Windows, so there’s that.
That’s not what I meant. Microsoft has been working on Windows ARM, sure, but has anyone else been working on Windows ARM? As far as I know you can’t even get Firefox on ARM.
I suppose that they have a compatibility layer, but it’s nowhere near the performance of Rosetta 2.
Ah well, Firefox definitely has a native Windows ARM build available on their website, but yeah, most applications certainly won’t.
Oh yeah, I stand corrected. I’m surprised I didn’t see it before.
Sure - but Apple has been “working on” ARM since they co-founded the company in 1990. Microsoft is definitely on the back foot here.
I think Qualcomm is probably charging far too much for the SoC. Their pricing has been super high for years because they know nobody is matching their performance in the mobile space. Not sure how much of it is down to the smaller process nodes, too.
Isn’t that a bubble? Phones are 10x more performant than they need to be anyway. Not like in gaming/server market where it’s always too slow, no matter how fast.
My phone is now 7 years old and it still works perfectly. Maybe not the newest of the newest games, but I don’t care for games on my phone anyway. And the amazing contributors keeping LineageOS up to date for my phone model mean I don’t need a newer phone :)
People said that 10 years ago, and all those phones are barely usable now.
Yeah, it’s honestly quite impressive. Software developers have managed to take orders of magnitude of hardware improvement over the years and keep pace, ensuring that software still runs like complete utter trash garbage.
Not to speak of the battery life that hasn’t improved at all.
Battery life has definitely improved; my old Android phones (Moto 2, HTC Evo 4G, Samsung Galaxy S2, Nexus 5) wouldn’t usually last a day, while my Fold 2 now lasts 2-3 days.
My understanding is that Apple had bought up all of TSMC’s 3nm capacity in 2023. That exclusivity may be up now explaining why Qualcomm is selling chips based on 3nm. Looks like they are working with Samsung and TSMC on this chip. This article is bizarre as it underplays the reason someone would buy this laptop. Long battery life, low heat, high performance thin/light is very valuable. Not everyone wants to play games. Will be interesting to see if Microsoft delivers.
If it is not cheaper than x86 then people will just keep buying x86 computers.
If power consumption is lower, that means more compact cooling. There are a lot of people who would pay a premium for longer-lasting and lighter laptops, myself included.
Yep, yep. Also, their ARM chips might quickly become more powerful than x64 ones, as is the case with the Mx ones. At least when it comes to laptops. The article is really weird in focusing on the gaming experience; is that really a big market for laptops?
An arm steamdeck that could run games would be sick.
I bet Valve is already experimenting with box86 and box64.
I think you can find videos of Skyrim running poorly on a raspberry pi.
Idk Mx chip. I googled. Is it a chip for smart home devices?
The demand for gaming 💻 has been high. Firms made brands like ROG, Legion, and Predator because they wanted to give a considerable amount of focus to gaming.
Also, there can be high profit in high-end gaming 💻.
There’s high profit too in thin and low-mass 💻. For example, enterprise sales. Say a firm with 10000 workers buys 💻 from Asus. High profit.
Idk which generates more profit.
If power consumption is lower
Is it tho?
If it’s anything like the new MacBooks, then hell yes. I can go full days at the office, programming, without a charger. My old Dell XPS would crap out after 2 hours, tops.
Edit: I would come home with 60% battery left on the MacBook.
I appreciate Dave2D’s “M2 MacBook Air versus Windows” vid. Hope he’ll make an M3 MacBook Air versus Windows one.
If it isn’t, then there is almost no point in going ARM for Windows. Apple demonstrated that it can be much lower, or give better performance at the same power consumption.
Yes
That is a fair point.
meanwhile the long gone RISC hype train
Do you mean RISC-V?
Yeah, I was waiting for that. Did they ever have any plan to do so?
ARM is RISC (or at least a version of it).
I won’t write them off before I’ve owned one. I imagine they could be good for things like battery life, but I’m not sure if they’d be an improvement over other chips like Ryzen APUs.
Will be curious to see the advantages and disadvantages.
Samsung uses their competitors’ chips? Kinda weird to see.
What, how is Qualcomm competing with Samsung?
Apple uses Samsung hardware, btw.
Samsung makes Exynos chips, which are ARM. But Samsung even uses Qualcomm in their phones in other regions, so it’s not unusual.
Exynos chips are subpar compared to Qualcomm’s ARM chips, or at least they were until not long ago.
They still are
To surprise of no-one :)
Not sure what you mean, they’ve always used Snapdragons? The S23 from 2023 uses one, and the S3 from 2012 uses them in some models, and most galaxies between those do as well.
I know Windows does ARM to x64 translation decently, but does the chip also feature special hardware functionality to aid this, like the M chips (TSO for example)?
I think these ARM chips are more expensive than we realize! Apple’s egregiously high upgrade pricing on MacBooks sucks, and 8gb of RAM by default on the base model sucks as well, but it is likely to raise the average sale price of devices equipped with their chips. This has been known for some time, I feel.
I’ll cut Samsung some slack since we don’t know the unit cost of the Snapdragon chips, and they aren’t likely to sell out of these devices right away even with competitive pricing because of the state of Windows on ARM. I’m excited to see how Linux support pans out on the next generation of non-Apple ARM notebooks, though; I think this is a chance for some manufacturers to take Linux more seriously, as Linux on ARM is actually not a terrible experience.
Or it is just corporate greed. Samsung would love to position something that is just okay into a premium price tier and not have to pay Intel. Sure they’re going to pay Qualcomm instead but you can bet that Qualcomm is giving some great introductory prices to their early partners.
For example Apple uses HBM instead of DDR5. They also give the CPUs heaps of L1/L2/L3 cache to avoid memory access as much as possible. And some of the stuff they do with flash memory is just as expensive.
That’s the real reason Apple Silicon Macs cost so much and I’m more than willing to pay that price. But it’s also the reason those Macs are so fast.
How does Qualcomm compare? I have no idea.
Unless something changed, I believe Apple is using LPDDR5 since the M2. https://www.tomshardware.com/news/apple-introduces-m2-processor-8-core-cpu-10-core-gpu-up-to-18-more-performance
So this SoC benchmarks on par with AMD’s best integrated GPU? On par with the M3, but not the M3 Pro/Max. If I’m going to switch to Windows, I’m not going to buy a PC that’s less capable than an AMD unit with a discrete GPU lmao. Call me when these are on par with a 4080/4090 lol.
This is an ARM SoC. You buy your laptop with one of these because the battery should last 20-30 hours and it still gives good CPU performance at the same time. Not for gaming lol