www.yahoo.com/tech/qualcomm-snapdragon-x-elite-laptops-130150491.html
Qualcomm Snapdragon X Elite laptops suffer compatibility issues with many games — even Intel's integrated Arc Graphics is up to 3x faster
The OP in particular has an axe to grind, I've noticed.
"Were they all paid by Qualcomm?" is looney-tunes baseless conspiracy stuff that should just be removed from this sub.
EDIT: OP went back to a thread from several days ago to pester me on an unrelated topic. They will go harass other users if those users are critical of them.
Yeah, if anything, it seems like the opposite in this case.
To be fair there were a ton of sponsored reviews of the asus laptop.
Which was the worst showing of all the devices we've seen thus far. So...
The laptop was fine. But the bigger point is that those reviews were generally extremely positive and avoided the difficult situations.
It's a question, not a statement of fact. The early reviews posted before the general reviewers were almost universally positive. They often overlooked the glaring weaknesses.
It's a question, not a statement of fact.
It's trolling. And ironic, given how much blatant misinformation you've been spreading on this topic. Should I thus assume you must be paid by one of QC's competitors? Much stronger argument than your accusation.
What misinformation did I spread? That's ridiculous. I post articles on a wide variety of topics. At most, I am a very disappointed consumer who was excited by the pre-release reviews but greatly let down by the subsequent real, independent reviews.
What misinformation did I spread?
For example, claiming VPNs flat out don't work.
I post articles on a wide variety of topics
Lmao, sure you do. Which is why they all happen to be on one subject, and one specific "opinion" on that subject?
who was excited by the pre-release reviews but greatly let down by the subsequent real, independent reviews
Then why lie, and only link blogspam? Plenty of independent reviews (and better ones, at that) have come away with a positive impression.
This blatant dishonesty makes it seem increasingly likely that you have an ulterior motive for this behavior, financial or otherwise.
It's a question, not a statement of fact.
A loaded question, yes, which are used to disguise statements as questions. Do you still beat your wife?
I am a wife.
A wife that still beats her wife, eh? Women can marry other women, genius. It's 2024.
But I'm not married to a woman. What are you even talking about now? This is a very strange comment.
What are you even talking about now?
Loaded questions. Way to miss the point, you troll. Get outta here with your spam.
This sub seems very focused on the poor gaming from these chips
It's extra weird, because you could make almost the exact same statements for Apple's chips. But no one's claiming they suck.
Edit: Looks like many of these articles are just being posted by this one OP. Seems like he has some kind of personal vendetta, and that's being generous.
Lemaooo
Yeah, the question more and more becomes: who are those machines for then?
Everyone always goes "But most people only use a browser and MS Office", but is that even true? And if it is, are those really the same people buying a 1300€ premium Windows laptop?
It's certainly not true in the corporate world; most companies have some weird specialty or legacy tools they need.
It's also not true for lots of college students, depending on the degree.
It might be true for middle and high school students, but don't schools typically choose the devices? Mostly Chromebooks and iPads? The pricing will be a no-go.
Everyone always goes "But most people only use a browser and MS Office", but is that even true? And if it is, are those really the same people buying a 1300€ premium Windows laptop?
Look at Apple's market. Gaming is essentially non-existent. Qualcomm targets the same.
Apple supports a lot of specialty applications. In some fields, Apple devices are the primary hardware used. Just as an example, Apple has long had a very big market share in audio production (since well before the Intel Macs). Apparently those applications don't work on Windows on ARM at all at the moment.
It's a bit misleading to say most people just need a browser and Office. Everyone needs those, but most people also have some niche they work in.
Microsoft probably needs to spend more money getting people to port their software to ARM before Qualcomm gets market share.
It's a bit misleading to say most people just need a browser and Office. Everyone needs those, but most people also have some niche they work in.
But that's simply not the case. The average user is covered by those alone. Then you have native support for development, and also native Adobe suite (with some issues being worked on), and that surely covers most. Yet if you listen to these claims, the laptops are literally unusable.
Are you sure you are not confusing statistics like “99% of people use office but only 3% use statistics software” with “99% use only office”? There are hundreds of different software niches for different professionals.
The laptops are currently unusable for a very large share of the market, including basically everyone using the Adobe suite (except maybe Photoshop). "Some issues being worked on" means no professional will buy these before those kinks have been ironed out. People using the Adobe suite are paid to get work done, not to debug problems. And the Adobe suite covers a large part of the graphics market and some of video production, but not much else.
Then there is the problem that Adobe alone is not enough. You also need a plethora of external plugins and other 3rd-party software to work. The Qualcomm hardware seems fine, but the reliability of the ISA translation layer looks like a disappointment to me.
As I said, there is a lot of work to do on the software side before these laptops can actually be recommended. And Microsoft probably needs to put some money into that, because a lot of software houses don't have an incentive to invest in this before Windows on ARM has market share.
Are you sure you are not confusing statistics like “99% of people use office but only 3% use statistics software” with “99% use only office”? There are hundreds of different software niches for different professionals.
It's absolutely true for the average consumer and most business users. Sure, niches absolutely exist, but the exact same argument (and teething issues) were used against Apple when they transitioned to ARM. But in reality, even early adopters loved their M1 notebooks.
Almost everything just worked when Apple made the transition. But there is another factor: with Apple, everybody had to make a native version quickly, because the x86 Mac would be gone very quickly. That's not the case with Windows on ARM. For software vendors, it looks like just another platform to support, with a relatively small market share.
Almost everything just worked when Apple made the transition
That wasn't true though. Some of the exact same issues persisted. What remained of Mac gaming was further crippled, extensions for both audio and visual production didn't work, etc. I fully agree that the Apple ecosystem had the incentive to transition faster, but these are inherently transient problems.
And WoA is clearly here to stay. Qualcomm, not Intel, is Microsoft's flagship Windows partner going forward. And Nvidia will surely try to make a big splash when they join later. Plus, they also benefit a ton from the work done for ARM Mac support. That should help accelerate things.
We’ll see.
I just post the reviews that are posted - and a lot of those are gaming lately. It seems that Qualcomm made a mistake in feeding the pipeline with all positive news before allowing the general public to do independent reviews.
I just post the reviews that are posted
Then why haven't you posted any of the positive reviews? Instead you keep posting blogspam-tier "reviews", many of which aren't even the primary source to begin with.
Don't lie about your biases.
Other people were busy posting the positive ones for months before the product was released.
So you admit you just lied about "just posting the reviews that are posted". There are plenty of positive, newer articles, and many are way better quality than the blogspam you've resorted to drudging up.
Not at all. This very great, high quality review came up in my feed and I felt it should be shared. I'm not drudging up anything.
This very great, high quality review
Then you're simply lying through your teeth, as already established. This article itself isn't even a review.
Most reviews I saw did tell us that gaming isn’t great on the X Elite or they said they are not going to test it because there are lots of issues.
That's a questionable decision. Qualcomm marketed these chips as great at gaming, but reviewers don't want to actually test those claims...
This is the problem with most reviews (tech influencers): they run the two native ARM benchmarks, launch Chrome and Office, test battery life with video playback, and then call the laptops amazing. This launch isn't just an SoC launch; it's also making people use WoA, which is a big change, so vastly more tests need to be done than what they are doing.
JustJosh, who I'd never watched before this launch, is one of the handful of reviewers actually doing his job and reviewing the product rather than running a marketing campaign.
Qualcomm marketed these chips as great at gaming,
One slide saying it's competitive with MTL in some games, not "great at gaming". The rest is about the GPU's efficiency.
This isn't a review, this is barely an article. It just rehashes the (quite poor) PCWorld review, summarizes consumer sentiment around the X Elite, and then gives a vague rundown of other releases in the next few months (with lots of affiliate links).
Without trying to make an easier way to port x86 games to ARM, they are simply going to leave performance on the table; that's a 40-50% performance penalty, and not a very energy-efficient way to do it.
Without trying to make an easier way to port x86 games to ARM, they are simply going to leave performance on the table; that's a 40-50% performance penalty, and not a very energy-efficient way to do it.
Afaik, the parts of the game that run on the GPU don't need to be converted to ARM at all, so they wouldn't be affected by the x86-64 to ARMv8 translation penalty. Most, if not all of the poor performance in games comes from the subpar GPU hardware and software.
It's already pretty easy to port x86 games to ARM on Windows because the same compilers, etc. exist for Windows ARM as exist for Windows x86. Even Unreal Engine supports Windows ARM64 as a target for game development. The problem is, it requires a dev team to port them (mostly just build the outputs), test them, and release them. With the fact that there are no real gaming ARM systems, I doubt it's worth it to game developers to even bother at this point.
It's already pretty easy to port x86 games to ARM on Windows because the same compilers, etc. exist for Windows ARM as exist for Windows x86
My experience trying to get various Linux software working on arm systems would suggest that's far from the whole story.
Then YOU, my friend, are correct.
So you've never ported anything to ARM. Sure, you can compile it easily, but ARM has a weak memory model, and there are concurrency bugs waiting everywhere in code originally written for x86's strong memory ordering.
We'll see at Hot Chips, but what Apple did was make the CPU support a strong memory model, and Oryon is expected to have done the same.
Was* expected
Yeah, I guess that falls under what a real gaming system in ARM would be. There's the whole piece where I said, "With the fact that there are no real gaming ARM systems, I doubt it's worth it to game developers to even bother at this point."
My point is, I kind of addressed this. ARM has been used for gaming and does just fine as long as the system supports it. Apple's M series can be used for gaming. It's all about design and the current Snapdragon X processors are not designed for that kind of gaming.
Why port a game for the few hundred people who will buy these chips and try to game? Not worth it at all.
Dell is expected to sell hundreds of thousands of X Elite laptops, per their own projections. Stop with the garbage hate.
If it was that easy, everyone would be doing it.
Easy and effortless are two different things.
Of course, you should never trust pre-release reviews.
What pre-release review showed otherwise?
Most reviews I’ve seen have been much more positive than the gamer bros would ever acknowledge.
Like intel, AMD, and Apple cpus, these aren’t good for gaming.
It’s almost as though gamers are a niche market with specific needs and not representative of the average computer user.
Qualcomm lied about gaming performance, but I don't think many of the reviewers even touched on gaming performance since that's not what these chips are for.
The bigger concern is that performance nearly halves when running applications that aren't compiled natively for ARM.
But they aren't for business use either. The reviews say they aren't even compatible for enterprise use cases. No gaming. No enterprise. These are like Intel chips from 5 years ago, but unable to fill an enterprise need. Will they even run all the college software required? Probably not.
The vast majority of enterprise knowledge workers in the US just use m365 and a chromium browser. Both of which are natively supported on ARM. When people say these chips are for enterprise, they’re not referring to the factory floor.
it's the same problem, since chromebooks cost a small fraction of these. and the argument for these is just putting it in the same class as chromebooks ('target market only needs a browser and office 365')...
People want laptops that perform well.
...when limiting usage to browser/office 365.
and lolz, didn't you make a big deal about blocking me last week? haha
Yes, even with m365 / productivity apps, you can feel the difference of a higher spec’d device in day to day activities. Windows on Arm is snappy. I love using it in parallels on my m2 pro. Noticeably better than my surface laptop, which itself is an i7 with 32gb of ram (enterprise only SKU)
snappier youtube videos? surely, that's worth the premium. i'm sold!
You know exactly what I’m talking about and are being intentionally obtuse
...when limiting usage to browser/office 365.
So like 95+% of usage? And other things like the Adobe suite are coming along.
and lolz, didn't you make a big deal about blocking me last week?
I legitimately don't remember you. Think I typically follow through on blocks.
...cool, so everyone should just buy chromebooks instead. this doesn't seem to be the argument you seem to think it is.
womp, womp
...cool, so everyone should just buy chromebooks instead
By the same argument, no one should buy Macbooks. They also suck for gaming, after all. Should highlight how dumb this take is.
If you can actually read (which at this point seems to be in question), I never said I blocked you. But thanks for proving that you are, indeed, just trolling. It's funny to watch the denial. Even Intel's own numbers don't show them coming close to what reviews have shown from the Qualcomm chips.
Except the management plane for a windows ARM client is identical to a windows x86 client, which enterprises have been managing for decades
and yet, the point stands. it's terrible at gaming, the efficiency is moot when it needs to emulate x86, and it has compatibility issues - effectively making it akin to a chromebook.
But they aren't for business use either. The reviews say they aren't even compatible for enterprise use cases.
No, this point doesn't stand. It's not true. M1 MacBook Airs are already dominant in the workplace. Why? Because they run M365 apps and Chromium browsers very well, with great battery life and a lightweight chassis, despite lacking support for legacy apps without emulation.
These are likely to be very popular in the enterprise space, regardless of what your YouTuber reviewers say.
These comparisons to Chromebooks are nonsense. Manageability and minimizing adoption/deployment costs are a top priority in enterprise environments. These Snapdragon chips can be deployed with zero change to existing processes. I know 200k+ seat customers who already have Snapdragon chips being added to their enterprise catalog for procurement within Q3.
lolz @ citing the other guy and inexplicably talking about 'my' youtubers. so tell me, who exactly are my youtubers? show me one instance of me citing any youtubers. oh, right - you can't.
The reviews say they aren't even compatible for enterprise use cases
Such as?
The article said no VPN applications currently work with these X laptops. No VPN, no enterprise. Sorry.
So you continue to blatantly lie. The biggest name in the business, Cisco, got support as far back as 2019.
https://www.windowscentral.com/cisco-anyconnect-secure-mobility-client-gets-arm64-support
Some VPN is not all VPN. Especially when Cisco alone probably accounts for half or more of enterprise. It's the consumer-marketed ones that have issues.
That article directly calls into question the lack of enterprise viability based upon lack of functioning VPN compatibility. It's not my opinion. I don't own one.
And yet, as I pointed out, not only did you lie about the article's claim (it's not all VPNs), but the article doesn't focus on enterprise either. The apps listed there have very low enterprise adoption.
"But that’s not the cardinal sin. No, the fact that most VPN apps don’t work because they don’t yet have native Arm versions might be an absolute deal breaker for some."
I am rehashing from the article. There was a second video review which noted the same. What major companies operate without VPN requirements today? Can you list some?
Look, these pre-reviews led us to believe these were going to beat Apple in battery life and Intel and AMD in performance. It turns out neither is particularly true, and a lot of people are frustrated by this. The compatibility issues are really common in most of the articles I am reading.
It's not for gaming, period.
Exactly!
every time i come to this sub i see your negative posts about their laptops lol.
i don’t think these laptops are meant for gaming. i don’t know why reviewers and folks here put so much emphasis on that. i think this is great competition that will benefit consumers. from what i can tell, Intel and AMD can’t slack off, or they risk getting caught. WoA is catching up fast to x86. i’m a gamer myself, but gamers are not the majority of the laptop market. i see my kid and a bunch of college students getting these new Snapdragon laptops. competition is good.
This sub acts like these laptops are supposed to compete with Alienwares and ROGs when their explicit goal is to stop the exodus of basic media consumption and office users to M series chips en masse lol
They aren't my posts. I am linking to articles from top notch reviewers who obviously weren't paid for their reviews.
I am linking articles from top notch reviewers
Except that you just linked an article from a trash tier reviewer that basically summarized another, better review in a more respected publication.
So no, we can dismiss that statement of yours as a lie.
newsflash: Apple's emulator is also garbage compared to x86, and rife with bugs, crashes, etc.
source: user of an m2 macbook pro who tried to use it.
Most macos software is native arm at this point.
As for games, most of my steam library runs under d3d metal.
Most macos software is native arm at this point.
But not games. Which seems to be what 80% of these articles are about. Steam is basically dead on Mac.
That has been the case with macOS even with Intel. The interesting thing here is that emulated Wine works far better with Windows games than Windows on ARM itself does. It's especially evident with older titles.
That has been the case with macOS even with Intel
Yes. So why aren't there a dozen articles claiming Macs are failures?
Which games have you been trying to run?
I’m using an m1 system and I’ve been able to play Elden Ring, GTAV, the Far Crys, and all my other games just fine. The only game I haven’t been able to get running is RDR2
It says Elden Ring gets 23FPS on Apple. I guess you can play anything if you don't care what it looks like
I’ve been running it on max settings @ 60 FPS through crossover+CXpatcher on Sonoma @ 1200p. I don’t know what source you’re citing but from what I can tell, the performance numbers you’re citing seem to reflect that of a base m1 system.
It says
Who is “it”. I’m sure the guy above you isn’t only getting 23fps
The problem with games is the lack of standards in how GPUs work. There's a lot happening in software that needs specific handling in the drivers for some games to work, especially in DX12. Sometimes the game needs a patch; sometimes the game and the driver both need a patch.
Because each vendor has different ways of doing shaders and RT in hardware, they have to have matching drivers to account for this, and while a lot of games do use standard calls, those taking advantage of more direct hardware calls run into these issues more.
Qualcomm is going through the same pains that Apple did with their M-series chips. Color me not surprised. Give it a year or two for a lot of software to be native software for Qualcomm’s Snapdragon and give it probably another decade before a lot of games will run natively (unless cloud gaming takes over all of gaming).
Not the same situation. Apple made it clear to developers that M-chips was the only SoC going forward. So developers were forced to accept it as the future.
On Windows, the future of ARM is uncertain. Microsoft has been playing with it for a decade, and it's still a mess. While the future for WoA looks better now than it ever did, Qualcomm won't move the needle in WoA sales this year; Intel and AMD do magnitudes more volume. So why would developers early-adopt a platform that might have 2% market share by the end of the year? They won't. Same reason Linux gets shunned, too.
Developer support will obviously grow (it can't get any smaller...), but it will definitely take more than a year for the vast majority of productivity/general-use applications to even consider native ARM builds.
Not the same situation. Apple made it clear to developers that M-chips was the only SoC going forward. So developers were forced to accept it as the future.
Many didn't though. Gaming is de facto dead on modern Apple platforms. And yet that same use case seems to be the justification to claim Qualcomm's chips suck and won't get any traction.
These chips have been out 5 years and that we aren't really seeing anything good is not a surprise. Intel and AMD will be 10x better in a couple years and Qualcomm will get to where they are today in a couple years? Why get Qualcomm then? They are expensive!
that we aren't really seeing anything good
Crushing AMD/Intel in battery life and efficiency doesn't count for anything?
Intel and AMD will be 10x better in a couple years
And yet they haven't. If anything, the gap has widened.
So at the end of the day, the Snapdragon has good battery life and nothing else good? You know an 11th gen Intel chip hit 18 hours of battery life according to another review.
the Snapdragon has good battery life and nothing else good? You know an 11th gen Intel chip hit 18 hours of battery life according to another review.
The Snapdragon has both battery life and performance well beyond an 11th gen Intel. For someone so obsessed, you clearly haven't actually been following reviews. What's your angle?
I don't have an angle. The very interesting review came up in my feed and I felt it would be interesting for the community.
Lmao, sure you don't.
This "very interesting review" isn't even a review. It's a blogspam rehash of another article.
Intel has had terrible efficiency/performance per watt for as long as I can remember.
There’s always been an alternative more efficient than Intel. For many years, it was PowerPC. Now it’s ARM.
I’m honestly surprised Apple switched to Intel in the first place. In hindsight, I think it was a mistake.
No, PowerPC sucked at the time.
In 2005, Apple was unhappy with the chips that IBM was putting out. Their laptops were stuck on the G4 because the G5 used too much power and ran too hot for a laptop. The G5 ran so hot that the Power Mac had to be liquid cooled.
Apple was getting frustrated that IBM wasn’t really making chips custom to suit their needs (ironically, they quickly had the exact same issue with Intel).
Also in 2005, PA Semi announced their PWRficient chip, which was their own custom PowerPC design. It was 2GHz dual-core and only used 8-14W of power, significantly less than both the G4 and G5s Apple was using, which only had 1 core.
Apple was in serious talks with PA Semi to use these chips. The problem was the chips weren’t going to be ready until 2007, which apparently was too long for Apple to wait, so they went with Intel instead. This apparently blindsided PA Semi, who was almost certain they’d win Apple’s business and were in advanced talks with them.
The problem was that Apple ended up with the exact same heat, battery life, and efficiency issues with Intel as they had with IBM’s chips. As early as 2008, they were already running into thermal issues with Intel.
The MacBook Air (especially the original model in 2008) ran extremely hot, was prone to overheating even with the fan going at full blast, and had poor battery life. None of the Intel MacBooks really performed amazingly, and the fan would ramp up to full blast any time you really strained the processor. The Intel Macs (even the desktops) were all pretty infamous for being thermally limited, and thermal throttling.
There was also the infamous 2013 Mac Pro, which was so thermally limited by Intel and AMD’s chips that they were unable to upgrade it. It also overheated easily.
By using off the shelf Intel and AMD chips that Windows PCs also used, Apple was unable to differentiate the Mac based on performance. It was basically “the same as a PC but it runs MacOS”. There was no more “twice as fast as a PC” like they used to be able to claim.
Ironically, Apple ended up buying PA Semi in 2008 to create the Apple Silicon ARM chips they use today.
If Apple had purchased PA Semi a few years earlier in 2005, they could’ve made their own custom PowerPC chips for Macs without needing to switch to Intel, then eventually switch to ARM in 2020 (or maybe even earlier).
The Macs during that time would’ve been more competitive than Intel in performance per watt, and wouldn’t have had any of the thermal or battery life issues the Intel Macs had.
Apple was in serious talks with PA Semi to use these chips. The problem was the chips weren’t going to be ready until 2007, which apparently was too long for Apple to wait, so they went with Intel instead
Intel had Conroe ready before then. "Better than G4/G5" was not good enough.
The MacBook Air (especially the original model in 2008) ran extremely hot, was prone to overheating even with the fan going at full blast, and had poor battery life. None of the Intel MacBooks really performed amazingly, and the fan would ramp up to full blast any time you really strained the processor.
Intel, and Haswell in particular, was easily best in class at the time. I don't think that can seriously be debated.
There was also the infamous 2013 Mac Pro, which was so thermally limited by Intel and AMD’s chips that they were unable to upgrade it. It also overheated easily.
Like many of the thermal issues you reference, this can be blamed on Apple's design. Any decent chip can thermal throttle if you starve it for cooling enough.
Intel had Conroe ready before then. "Better than G4/G5" was not good enough.
The PWRficient was better than both IBM and Intel's chips.
Better than the Core Duos and Core 2 Duos.
Dual-core 2GHz using just 8-14W. Literally perfect for their laptops. And they could've done an entire family of chips based on that design, as PA Semi was planning.
Apple was in advanced talks to use these chips. The only issue was the timing, they'd have to wait until 2007.
However, if Apple bought PA Semi in 2004-2005, I bet they could've accelerated that a bit and had the chips ready earlier in 2006.
PA Semi was a small startup with limited resources.
They were designing the family of chips specifically for Apple. After Apple rejected them, they cancelled the rest of the chips and only ended up releasing one.
Intel, and Haswell in particular, was easily best in class at the time.
Did you use the 2008 MacBook Air? It was basically defective.
It overheated at the slightest thing, even having a bunch of browser tabs open. Even with the fan blowing like a jet engine, the chip still hit 100ºC and thermal throttled.
My 2010 and 2020 Intel MacBook Airs did the same thing. Terrible.
Like many of the thermal issues you reference, this can be blamed on Apple's design.
This was the entire reason they switched chips, they wanted to be able to make any design they wanted.
Steve Jobs literally said on stage "We have amazing designs we don't know how to build with the PowerPC roadmap", which is why they switched to Intel... and had the same problems.
Obviously, trying to shoehorn off the shelf chips into their designs worked poorly, whether that was IBM or Intel's chips.
They wouldn't have had that issue if they had bought PA Semi earlier and were making custom PowerPC chips for Macs based on the PWRficient cores.
It'll get there in a year or so. Intel has come a long way quickly though.
No, it won't. Intel "only" had to improve their drivers and optimize for games, which was still a monumental task. Qualcomm, and any WoA SoC, also has to deal with the awful PRISM emulation, or wait for developers to port games to ARM (and there's zero reason to at this point).
This isn't going to take a year; it will take five, as the problem isn't just Qualcomm's drivers but all games being x86 code. Even MacBooks still have a very rough time playing games, and Rosetta 2 is better than PRISM and Apple has better drivers. And as we've seen, developers have very little interest in supporting niche ecosystems. Valve and other developers infamously dropped macOS support from their games because it wasn't worth it, and most games still have no native Linux support either.
So this is going to take a long time to fix, if ever. Nobody should buy these devices to play games on, as they will be obsolete by the time that's possible.
Intel has millions of laptop running Xe graphics to gather data from, that's a big advantage Qualcomm doesn't have.
The competition is finally getting Intel motivated to put out better products.
The chip industry takes multiple years to develop products.
Alder Lake began design as soon as Zen 1 launched. Lunar Lake began design as soon as M1 launched.
And Snapdragon started around then, too. If folks were familiar with product discussions with Intel versus other partners, they would have a better understanding of the monopoly-like behavior Intel is still exhibiting despite more options becoming available.
But neither Intel nor AMD are sitting still. It's also pretty difficult to dedicate a lot of resources to this effort if you don't have the customer base.
Mac was easy because once Apple made the switch everyone had to follow, whether they liked it or not, 100% of new Macs being sold were on the ARM architecture. Qualcomm doesn't have such luxury.
Right, neither are sitting still. I'm just pointing out when people make statements like "Intel is finally waking up" or along those lines.
There are no real short term responses besides increasing clock speed. Lunar Lake and all of these efficiency focused products have been in development for years.
SDE, I think, is fine from a design perspective. Looking at SDE laptops as final consumer products (and evaluating the software compatibility experience), while very important, is a separate conversation from the technicals behind the chip. I also think SDE isn't good enough to deal with the compatibility issues.
These issues genuinely make me think that 20 years from now, PC gamers will still play on x86 while the general Windows user will be on Windows-on-ARM laptops. There really isn't a use case anymore for a desktop when a laptop can do the same thing and be plugged into multiple monitors. Which will lead to a divide: a small audience willing to pay lots of money for PC gaming, and the student/white-collar worker who needs a computer on the go that can connect to as many monitors as they please. Unless by some engineering miracle Microsoft can figure out a solution, that ain't happening any time soon.
I'm getting tired of this sub's biased opinions about these chips lol. It was funny and kinda legit at first, because yeah, the hype was kinda overblown, but it's just stupid now. It's obvious that these chips are for people who have ThinkPads and the like, where the hype is the efficiency and office stuff. Why don't the negative nancies cover the driver issues? That's a legitimate concern. Or the Google Drive interoperability issues.
Who cares?
The Snapdragon PC owners never use productivity apps and never play casual games.
They only play video all day, unplugged, in balanced battery mode.
The Snapdragon laptops are great for productivity but not good for gaming.
I wouldn't even say they are great for productivity. There are issues on the productivity side too, and it's not like these chips are setting the world on fire in terms of performance either. Strix, for instance, should be significantly faster than the Elite X.
The Elite X may be good enough for light workloads and content consumption, but at that point it's little more than a tablet. And it's way too expensive for a glorified tablet.
The vast majority of productivity work is done in browsers and office apps. Yeah it’s not great for video production but the vast majority of people who want a long lasting mobile productivity machine will never edit a video or do CAD work.
The vast majority of productivity work is done in browsers and office apps.
When someone talks about productivity tasks in the context of PC hardware, they mean traditional apps which run locally on the computer.
Just because some administrative roles only require access to Google Docs doesn't mean there isn't a big demographic of folks who actually use their local compute for work.
vast majority of people
vast majority of people don't need anything more than a Chromebook. But here we're talking about premium $1000+ laptops. Which should excel at productivity as well.
My delusional take: with how good Nvidia's software is, they are going to release ARM CPUs soon with a translation layer specifically oriented for gaming that's actually going to perform well 🤡
Nvidia doesn't even have a CPU architecture team. So they would be using vanilla ARM cores. I think this gives them even less of a chance than Qualcomm.
This sub seems very focused on the poor gaming from these chips. What'll impact consumers a lot more are random driver issues around printers and peripherals, Google Drive issues, etc.