Intel's 8th gen CPUs in 2017? Still 14nm?


Not a master plan, no. Obviously they would have liked RT to be successful, for example. But it's not something they've given up on. Unlike… all the other things that they've given up on. Windows on ARM isn't over until they stop working on it. Until then, it's one unbroken attempt. And they keep refining it. They keep getting closer to the only-slightly-painful Rosetta days Apple had.

No, the alternative is bringing actual desktop computing to mobile devices. Nobody else is doing it. Not because it’s not worth it. Because they can’t. Nobody else has such a strong desktop OS as Microsoft that they could port to the mobile world.

That desktop OS loses relevance day by day, though, while its fast-rising mobile competitors get ever more powerful. Is Windows even the most used OS in the world now? Serious question. Because Android's gotta be up there.
If we're going to talk about performance as the bottom line, and we shouldn't, but if we are, know this: legacy has always been a millstone around the necks of both Microsoft and Intel. And both of them have worked miracles with what they have: x86/x64/WoW64 emulation, for example. But Apple and Google have less work to do. Their slope is not as steep.


I never said it was over. I just said it's one of those flops. I didn't say when it will fail, I'm not a clairvoyant :slight_smile:

Actually, Android isn't the most used one either. Nor is any of the mainstream operating systems. The obvious winner is Linux: plain old Linux with a really minimal kernel, barely enough to function, and very specific user interfaces with super limited functionality. Yeah, the one in your router, your TV, maybe even your fridge. But does that mean we should all run that sort of minimized Linux distribution on all our devices? I'll leave that for you to answer for yourself.
Yes, Android is popular. But what’s the reason? Because Windows doesn’t run on phones. There are probably more smartphones than PCs in the world.

And this is where you're wrong. First they have to build the apps, and the user experience. What Microsoft has is an invaluable set of resources built over literally several decades. Keeping backwards compatibility is less work than rebuilding everything from scratch.


Oh, I'm not sure it will be a success either. The difference between you and me is that I think its success is of existential importance for Windows as a consumer operating system.

Well…!!! :laughing:

Look, we're talking about devices that people surf the internet with, communicate with, bank with, consume content on, etc. What you generally think of as "personal computers". It's a loose analogy, I know. It's true that nobody is writing their CV on an Android phone (well, I'm sure a lot of people are, actually, but you know…)
I don't see much of a difference between form factors, which is why I'm here, as a Hyper Early Bird backer of a device that could be called a laptop, or a tablet, or just "a computer". It's something that could theoretically be used for work, or for browsing the internet, or just for content consumption.

We know that the PC market is shrinking because, it turns out, a lot of people didn't need all the stuff PCs did in the first place. But I suspect that among the PCs that ARE selling, ultra-mobile form-factor devices are the most popular (meaning thin-and-light laptops, convertibles, and 2-in-1s, as opposed to those massive 17-inch laptops people used to have just five or so years ago).

So Microsoft knows that it needs an operating system that is extremely efficient, extremely secure, fast to wake, always connected, can fit in absolutely minuscule devices, etc. It has all these goals that legacy kind of interferes with. Intel wants to be able to provide all these things, but it has an ancient architecture with a ballooning instruction set, first conceived in an era when CISC was the only way to go.
It’s. A problem.

In business, both are safe, to a degree. But then, so is IBM's Power series. Imagine a future where Windows is relegated to a role similar to Solaris back in the day. Microsoft imagines that world every day and is trying to prevent it from happening.

Neither Apple nor Google has to build the apps. The hardest part is getting people to use your platform. As long as they build a popular platform that is powerful enough to enable complex apps, developers will, eventually, come. IMO. Neither started from scratch either, but that's another story.
The popularity of iOS and Android can't be denied. It's not just that they're widely used; they're also widely loved.
Meanwhile, the application frameworks and APIs have gotten increasingly complex with a pretty obvious end goal. The operating systems now support side-by-side multitasking, direct access to GPU hardware, multiple input types (but, crucially, not pointers on iOS, which I think is a huge mistake), etc.

Let's look at legacy apps, though, of which I believe there are three kinds:

  1. The millions of industry-specific, or even company-specific applications that few people will ever hear of. They're what make Windows such a dominant force. They're also usually terribly coded, badly maintained, and probably ancient at this point. They never performed great, and sticking them in a virtualised/emulated instance would actually be beneficial for a number of reasons (with security being the primary one). A lot of the other stuff is written in Java and is platform-agnostic for that reason. Even more is, and always will be, *nix-based.
  2. Big professional applications, usually creative or business related. Stuff that Adobe and Autodesk make. Here I'll remind you that fifteen years ago, Photoshop and Final Cut etc. were said to be infinitely superior on PowerPC. There was a protracted period when Photoshop on the Mac was emulated via Rosetta and suffered quite a large performance penalty for it. But really, these applications have existed in some form since the 80s, have jumped OSes and chip architectures, and even now exist on at least two different operating systems. It isn't Intel SIMD extensions that are stopping them from moving to iOS or being selectively optimised for ARM. It's the lack of an end-user machine that would utilise them (for instance, the lack of an iOS laptop capable of running the pointer-centric user interface). Meanwhile, Affinity, which started on the Mac, understands that it needs to port itself to Windows and… iOS.
    Another note on this class of application: A lot of business software has successfully migrated to the cloud. The most obvious is Office 365, which lost VBA but nothing else as far as I can tell. Other apps were born in the cloud, and could only have existed there. Salesforce, for example. These will always be platform-agnostic.
  3. Games, and multimedia/consumer apps: The latter are easily replaced by alternatives. The former are, for desktop PCs, a niche. I think Microsoft should be a little wary of how much mindshare Steam has, as well, especially since Gabe seems to have a pathological, irrational hatred of Windows 8/10.

Generally speaking, legacy is ALL Windows has. Actually new, useful applications for the platform are almost non-existent. Sure, Windows can interface with any printer from like 2001 onwards. That’s great. But a lot of these smart devices that are popular these days have apps for iOS and Android and absolutely nothing for Windows. Which is interesting in itself, right?
The old stuff is going away. And the new stuff is being built with a mobile-first mindset.


It doesn't just have to be successful. For Windows to remain a successful platform, mobile devices need to run actual, real Windows. Not a shitty emulated version.


What about going back to the topic?
This isn't an x86 vs ARM thread.


Just popping in here to mention that Intel has spoken out against x86 emulation on ARM processors on grounds of intellectual-property infringement. They might allow licensing for it (or may be required to), but making x86 run on ARM and letting it loose on the market isn't as simple as just replicating the full instruction set on a RISC architecture (as complicated as that already is in a pseudo-hardware emulation design).


Since this thread has been revived, I'll take the opportunity to remind you: quantum computers already exist, and they have a very specific purpose; roughly speaking, they explore many candidate solutions at once and let the best one emerge. They're not faster for our day-to-day computing tasks, and they probably will never get as small as desktop computers, because they need really good isolation from the outside world and temperatures near absolute zero (I think I've read below 1 K somewhere) for the quantum-mechanical effects to hold up.

And the future of computing, at least for the short-to-mid term, lies not in new hardware but in optimized code. For example, you probably couldn't write a simple console app with modern tools that would run on a machine that originally ran an arcade racing game. Nowadays we've gotten far, far away from assembly, one of the earliest programming languages, in order to simplify coding at the cost of efficiency. But we have Go, for example, which is pretty much a new programming paradigm designed specifically to take the good aspects of modern programming languages without sacrificing that much efficiency. The best part about it is that it was designed from the start for multithreaded processors, while most of the usual programming languages were only later updated to support them, having originated as single-threaded languages. I think that can go a longer way toward making computers faster than shrinking dies at this point.


[Warning: this is a genuine question, not an expression of disagreement with a question mark tagged at the end]

But how easy is it to do that and still maintain backwards compatibility, interoperability, and seamless data migration (between applications, as well as between them and the OS)?


At some point that won't be possible anymore. For mainstream users it will suck for an upgrade cycle or two, like when the iMac moved from PowerPC to Intel-based CPUs. Other areas, like the financial and medical industries, need backwards compatibility, but they are also the ones that often have very specialized software built for their purposes.


All applications are translated into machine code in one way or another. In the end, all communication between software happens at the machine-code level.
Just as an example, during my internship I worked on a project that was written in 3 different languages; 4 if you count SQL. It really, really doesn't matter what language you use; you can always come up with ways to make it compatible with whatever you want.


So I agree that mobile should run Windows, but that's not really possible… the design of x86 was flawed from the beginning, and it's become the hardware analog of spaghetti code; it was never really designed to be power-optimized. Just have a look at the assembly instruction set and that'll give you an idea of the complexity of what's under the hood.

Does this mean that ARM has an edge over x86? No. I will admit that ARM is architecturally more elegant and efficient, but the de facto standard for desktop computing will remain x86.


Sorry, but I'm very busy these days, so I just wanna throw a few developments in here.
Well, I agree that a few laboratories have been able to realize the very basic concept of quantum computing, but even these achievements weren't there until recent years.
What I also meant was something you can bring to market.
As of now, the problem is that one can barely get a considerable number of qubits (quantum bits) working together.
As mentioned, these materials (i.e. topological insulators) need isolation and temperatures that are far from practicable. But there are many research attempts at finding new chemical compounds that work at room temperature without losing these physical properties (insulating inside, conductive at the edges); these materials must be affordable, and one also has to be able to produce them in noteworthy quantities.
The existing systems, with only a few qubits connected together, can only handle very basic calculations that can barely be considered useful.
Also, there are a few companies out there claiming to have built real, working quantum computers. But if one asks them for detailed information, they suddenly go silent.

As for new information technology: current research in theoretical physics includes the very interesting field of spintronics (superpositions of the electron's spin states are used, so that one can create immensely more states than just the 0 and 1 known from computer science). This concept will require a new 'quantum informatics', but I think it will not demand a tremendous amount of adjustment from current computer scientists.
Another research development I find very interesting comes from cooperation between researchers in computer science and genetics: the idea of DNA storage, to store truly massive data more efficiently. This is possible because (human) DNA is much more robust and has a whole other level of complexity than current computer chips. It could also be a big challenge, as it would require some adaptations and new thinking in current coding systems.


I've both written x86 assembly code and passed an exam where I pretty much had to memorize the whole architecture of the Intel 8088, so I can say I know at least the basics of x86. The most annoying thing about it is the limited number of registers; even for relatively simple operations, you often have to move arguments back and forth between registers and RAM. Some of the registers being designated for specific functionality doesn't help either. So I know it has its disadvantages. But when you think about how much of the code running on our computers is written in high-level languages like JavaScript or Python, you realize that optimizing that code can go a long way toward making the user experience much more enjoyable on cheaper hardware, as well as expanding the limits of fast hardware. Nowadays developers don't care that much about making their code efficient anymore, because they know they can just throw more cores/RAM/GHz/whatever at the problem. Of course there are limits to that, but from today's perspective it's fascinating just to look at how well optimized legacy code can be. People rarely, if ever, do that now. But once we hit the wall (and Intel is already showing signs of that wall approaching), we will have to step back and re-learn how to optimize our code. Just like in the 80s.
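The kind of optimization I mean doesn't even require dropping down to assembly; it often lives entirely in high-level code. A tiny hypothetical Go example (function names are mine): both functions produce the same string, but one does quadratic work and the other linear.

```go
package main

import (
	"fmt"
	"strings"
)

// joinNaive concatenates with +=; each step copies everything written
// so far, so total work grows quadratically with input size.
func joinNaive(parts []string) string {
	s := ""
	for _, p := range parts {
		s += p
	}
	return s
}

// joinBuilder writes into one growable buffer, doing linear total work.
func joinBuilder(parts []string) string {
	var b strings.Builder
	for _, p := range parts {
		b.WriteString(p)
	}
	return b.String()
}

func main() {
	parts := []string{"never ", "copy ", "what ", "you ", "can ", "append"}
	// Same result, very different cost once the input gets large.
	fmt.Println(joinNaive(parts) == joinBuilder(parts)) // true
}
```

On a handful of strings the difference is invisible, which is exactly why nobody bothers; at millions of strings it's the difference between milliseconds and minutes.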
Of course, unless someone comes up with a new way of making processors.

Which brings us to ARM… It breaks backward compatibility, not even partially but completely. Whoever tries to make a desktop OS that works on ARM (I know Linux does, but it's not popular and never will be) will face a serious "app gap" problem. I know, emulation is a thing, but it would take a long time to develop an ARM processor that can emulate x86 at the native speed of a modern Intel Core processor. Way longer than we can wait. So either we optimize our software, or we try something totally new. Rumor has it Intel is already exploring "new" semiconductors more efficient than silicon, so maybe the future is in their hands after all. The question is how far that can go; I'm not good enough at physics to tell.

I actually saw a video by LinusTechTips… they let him in. I don't remember if there was actual proof of the thing working, but they said they had some customers using it, and the video looked quite convincing.

And how do you imagine that working out? In my understanding, a quantum computer would work sort of like a non-deterministic Turing machine. That is useful for many scientific problems that would take ages to solve on today's computers, but for some problems it's not useful at all. It might be very efficient for pathfinding algorithms, for example, but for web browsing, on the other hand… not really.


I’ve seen the same lol


I can't say I've memorized even that much; I've only extensively used the 8051 in assembly. But my point is that with every new 'feature' in x86 land, they add another handful of instructions, and that's been going on for the past 30+ years. Great for backwards compatibility, as you noted, but terrible for any sort of architectural optimization.


Oh, yeah, that too. But that's the disadvantage of every CISC architecture. It would really make sense to use only RISC nowadays and forget about CISC, because we don't really write code in assembly anymore, but backwards compatibility strikes again… Seriously, if we had to start over from scratch, the choices would be very different now, because we live in a different context. Maybe if ARM came up with a really, really powerful consumer-oriented processor and someone wrote a desktop OS for it (Linux doesn't work here because it's not consumer-oriented by design, and there are way too many apps that aren't user-friendly at all), it might find its way in between PC and Mac. But it would have to be totally new; it shouldn't drag any backwards-compatibility burden with it, even on the software side. And that's not likely… it's not so easy to write a new OS just like that. It could be a Linux distribution, but then it would need to be as far as possible from traditional distros. Like Android. It needs to earn its own name and a totally different attitude from developers.

But as I said, this isn't gonna happen anytime soon. And by the way, ARM is also approaching 10 nm. While they do have more headroom for improvement through optimization, they're still approaching the same hard wall as Intel…


It seems the newer laptops are benefiting from much stronger benchmarks AND better battery life.

This is a Dell review, but the Lenovo 920 had similar results.