> With a Snapdragon inside your PC, you'll no longer need Wi-Fi to fetch your latest e-mail and catch up on Twitter. Instead, you'll be able to get online wherever there's cellular connectivity.
I'm not buying the optimism, especially not in the US. Cellular data is still horribly limited. In Canada it is even worse. "Unlimited" plans are making a comeback, but they come with automatic throttling at varying points.
Oh and since we're talking Microsoft and Windows, you can't control data usage. Windows 10 uploads your personal information at unknown intervals and downloads ads and sponsored apps without consent.
What good is faster cell data with horrible data limits and an OS with a poor concept of when data is permissible even in the "Enterprise" edition?
Three years ago, our local Spanish operator Movistar deployed the 4G mobile network and started offering an impressive 75 Mbps download speed and 25 Mbps upload speed.
At that point, the largest data plan you could get from them was 1 GB per month (now I think you can get 2 or 3 GB, not more).
This meant that if you used the connection to its full capacity, you would hit the monthly cap in around 1 minute 50 seconds. With the current data plans it might last you 5 minutes or so, not more. With other operators you can get up to 8 GB, i.e., around 15 minutes.
With the current pricing, this is the most useless technology ever. I don't know why they even bother (well, I suppose it probably has some advantages on their end as well).
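A quick back-of-the-envelope check of those figures (a rough sketch in Python, using decimal units and ignoring protocol overhead):

```python
# Time to burn through a monthly cap at full speed.
def seconds_to_exhaust(cap_gb, speed_mbps):
    cap_megabits = cap_gb * 8 * 1000  # 1 GB = 8000 megabits (decimal units)
    return cap_megabits / speed_mbps

for cap_gb in (1, 3, 8):
    minutes = seconds_to_exhaust(cap_gb, 75) / 60
    print(f"{cap_gb} GB at 75 Mbps lasts about {minutes:.1f} minutes")
# 1 GB -> ~1.8 minutes, 3 GB -> ~5.3 minutes, 8 GB -> ~14.2 minutes
```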
For most people once their connection is fast enough that any given thing they try to do is not noticeably annoying due to slow download speed, their download speed and their monthly data usage are largely independent.
For example, over the 12-17 years I've had Comcast they increased my download speed from 8 Mbps to 20 Mbps, then to somewhere around 40 Mbps, then to 60 Mbps, then to 90 Mbps, and now it is at 120 Mbps.
I don't remember what my monthly usage was back when it was at 8 Mbps, but from 40 Mbps onward, and I think maybe at 20 Mbps, my monthly usage has stayed about the same, varying between 20 and 40 GB/month.
I've always appreciated the faster speed, even though I don't use it to download more. I appreciate it because it means it takes less time to download. For instance, if I'm working at home and realize I need a big file from work to continue, it might be 5 minutes to download at 120 Mbps. It would have been 30 minutes at 20 Mbps. The former means I can grab it while I take a bathroom break and a refill on my water, with almost no disruption to my work day. The latter means a big disruption.
Around here they started offering unlimited, uncapped traffic (well, up to 1 TB), BUT after you go over 5 GB in a day your connection is disabled and you have to send an SMS to enable an extra 2 GB. You can do this an unlimited number of times (up to the theoretical 1 TB).
So this makes streaming, torrenting and the like awkward, but for daily internet use it's a decent option (it's a relatively inexpensive plan); e.g., I could work remotely on this plan.
Not sure how this is worse than what Android or iPhone do. My apps are constantly chatting on the internet and I have zero visibility over it and any attempt to control it involves a lot of behind the scenes controls not obvious to the end user.
E.g., Dropbox can only synchronize the camera roll when you keep the app open or enable the location updates permission. Also, on iOS, you can disable cellular data for specific applications.
This is exactly why you can't get a decent photo uploader on iOS to upload in the background. The existing ones either stop after a few because iOS puts it to sleep (or whatever the term is) or they have to use location services to stay awake longer, but that's a gross hack.
I get why Apple does this. I just wish they would let me choose to have an app run as a service. Being able to sync all my photos to NextCloud while running in the background (and not divulging my location) would be great.
Developers will abuse this ability, and the average iPhone will be filled with apps constantly uploading usage data and downloading meaningless updates, significantly reducing battery life and performance.
Apple should just have some kind of API for background uploads. I know that an API for background HTTP downloads already exists.
No. There are many other cases where "background upload" isn't enough, like fitness tracker apps that want to sync your tracker with iOS Health in the background. There must be a more general solution.
My Garmin sports watch syncs via Bluetooth and uploads to their website in the background. What more do they need beyond this for your application? Is there a particular limitation that it can't update Apple Health in the same background operation?
You're right. It limits, but does not stop, updates on metered connections. Say you have a device that is on metered connections only for weeks or months; Windows will rightfully start downloading updates, as not having these updates is a security risk. I think MS has made a lot of controversial decisions with Windows 10, but I don't believe consumers should have the ability to dismiss security updates out of convenience. The risk to themselves and the larger internet is far too great.
Also, putting aside the politics of telemetry data, the actual file is a compressed XML file. I don't know the real size off hand, but it's just compressed text. It's not going to affect your metered connection's quota. A 20 KB transfer every month shouldn't bankrupt your connection.
Not to mention, it's trivial to disable telemetry if you have basic technical chops. Windows updates too. Or don't use Windows if your use case is so far from the norm. It's designed for the lowest common denominator, not for techies.
Hasn't this market for small/light/cheap "net PCs" been proven out by Chromebooks though?
I know this isn't exactly the same thing, and the baked-in LTE is an interesting feature, but I feel like this is Microsoft going after that market specifically.
I know that if the price point was low enough, I'd get one of these as a travel machine, especially if I can get it onto my "share everything" data plan and split the usage between it and my phone.
I already tether my Macbook Pro to my phone when I'm travelling and need on-the-go internet anyways, so it would amount to the same thing.
You've been able to limit data usage on Windows 10 since launch, or at least shortly thereafter.
"Set as metered connection" under the connection settings.
You can disable telemetry at any point if you want to.
The only "unwanted ad" - that everyone constantly complains about, is for onedrive. Guess what - every time I log into my mac, it asks me if I want more icloud storage. So literally the same thing. Ubuntu wasn't much better with their Amazon experience. If you're going to tell me to install Gentoo or Arch on my daily driven laptop we aren't having a rational discussion.
I've NEVER had Windows download a "sponsored app without my consent". What third party app is this you speak of?
I've had ads on Windows lock screen ("Get fun facts, tips, tricks and more on your lock screen"). I was fairly sure I had turned this off, and it was weeks before I saw an ad (replacing Spotlight background), but once I did see an ad, I wrote my own program to set the lock screen background and stayed well away from the MS Spotlight.
Spotlight itself is an ad for Edge; if you want more information on the photo, it'll pop up in Edge, not in your default browser.
Windows installed a bunch of sponsored apps / icons when I first installed it. Xbox, Skype, Groove, Money - they may all be owned by MS, but they're still crapware as far as I'm concerned.
One of the things I despise most? In Edge, when I go to the driver download site for my laptop, Cortana pops up with "Can I interest you in a coupon?".
Ubuntu no longer has the Amazon suggestions (they listened to the users), and Arch and Gentoo aren't the only alternatives to Ubuntu; Fedora, Mint, Elementary, etc. are all fine and as easy to use as Ubuntu.
Why be so obviously disingenuous? Why make excuses for companies to screw your software?
I switched to Linux a long time ago and every time I'm forced to use a proprietary operating system I get reminded that I made the right decision.
I doubt any serious Windows user supports adware and privacy violations; only the people who invest in Microsoft do.
Notice they said email and Twitter, both things that don't use much data. They already know that AT&T and Verizon will choke the crap out of this with their false limitations.
Yeah, I'm not really buying the LTE being the "killer feature" thing either. I hope Qualcomm is not betting the farm on that. Its ARM processors will have to stand on their own feet on performance alone. Few people buy their laptops based on LTE support, and although I do believe it could become more popular in the future, I don't think it will have a fast pickup, with or without Qualcomm's promotion of LTE laptops.
I definitely think it will be popular in the future. It depends on who wins the race for ubiquitous network coverage, though.
Google, Facebook, and SpaceX have all initiated programs to provide global high-speed wifi access.
This is probably the last chance for cellular network companies in the long term. If wifi hits first, with projects like Google Home offering free calling as well as other IP-phone services, I think we could see those companies begin to fall.
I'm taking the rather long view, though.
For average, non-power users, I could see these being quite popular -- especially if they can handle Windows 10, MS Office, some light Photoshop, and fast internet. Otherwise I wouldn't mind one to turn into a highly portable Linux machine -- stars in my eyes, and all that.
> Otherwise I wouldn't mind one to turn into a highly portable Linux machine -- stars in my eyes, and all that.
Assuming you'll be able to boot Linux on such a device, that might still require exploiting some yet-to-be-found bugs to root it, and/or voiding its warranty. BTW good luck with GPU drivers, multitouch, exotic sensors...
Is tethering too complicated for most people? Simpler tethering seems like it would be more helpful here than a new CPU architecture that also requires people to have a second SIM they need to get on their plan.
I'd love to tether, but don't because my carrier (AT&T) won't enable it for a single device, only for all accounts on my family plan. My family could easily (even innocently) use up our monthly data allotment in a day.
Also, how can you limit data usage while tethering? Can a device tell if it's tethered, or does it just assume it has a normal wifi connection? Do you need to turn off automatic updates on laptops?
The short answer is yes: Windows typically does know when it is on a metered connection and doesn't do things like download updates. If it doesn't automatically detect it, you can also mark that connection as metered and it will remember it. I'm not sure about tethering, but when using built-in LTE you can even set what your plan limit is.
Not every modem can be marked as a metered connection. I'm using one such modem right now. And Windows decides on a whim to ignore metered connection settings.
None of this is solved by having a modem in your computer though.
Windows 10 does let you mark a connection as metered, though I don't think anything except the OS pays attention to that. But again, that's orthogonal to whether your laptop has a modem and will likely not be solved.
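For what it's worth, respecting the flag is an opt-in at the application level. A minimal sketch of what that opt-in could look like, where connection_is_metered() is a hypothetical placeholder for whatever platform query is available (it is not a real Windows API):

```python
# Sketch of an app-level policy for metered links. connection_is_metered() is a
# hypothetical placeholder for a platform-specific query; nothing here is a
# real Windows API.
LARGE_TRANSFER_BYTES = 50 * 1024 * 1024  # treat anything over ~50 MB as "large"

def connection_is_metered():
    """Hypothetical: ask the OS whether the active connection is metered."""
    raise NotImplementedError("platform-specific query goes here")

def should_defer(transfer_bytes):
    """Defer big, non-urgent transfers while on a metered connection."""
    return transfer_bytes >= LARGE_TRANSFER_BYTES and connection_is_metered()
```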
For most users, I would say yes. Of course they want it to be as seamless and simple as possible to connect to information.
Tethering is hardly a second thought for most people who are comfortable with the technology, but you're forgetting most of the rest of the world here, as well as a lot of people outside the tech realm. Never mind people who currently live outside of high-speed wired internet zones -- with cell coverage reaching further than wired zones lately I'd consider the option a boon to connectivity.
I fail to see the real downside to the public. Maybe you can elucidate further.
Consider how annoyed people have been at the Apple dongle situation. Sure, they'll grunt through it, but they want something better and less complicated. To the HN crowd, a phone tied to a computer to access all of the world's information is a large bit of genius (I used to dream about it when I was a kid... before everybody had cellphones), but to most users it's just another thing dangling about that they have to connect and disconnect and remember to shut off tether mode, and so on and so on.
If it's in one package, and it's as easy as hitting "connect", I think it will sell. That, and consider cottage-goers who dislike disconnecting, or people who live outside of high-speed zones. Or even the connectivity-paranoid (script kiddies aren't as likely to start hijacking cellular connections as they are to bite you through an insecure coffee-shop wifi). Doctors Without Borders might benefit, and other travelling charity groups like them. I'm just spitballing, but I can see cases.
Though, a lot of my speculation depends on price points.
> Consider how annoyed people have been at the Apple dongle situation.
Do you mean the USB-C with the gazillion required adapters or new cables?
It's relevant that you bring up Apple, because if you have a MacBook and an iPhone and pair them via Bluetooth, a tethering option shows up in your WiFi menu that you can easily select, same as any other WiFi network.
The advantage of it being an easy, yet manual process is that you can't accidentally use it and blow through your data. I do wish they would go the extra step and build "limit what you do because you're on a metered connection" support into macOS.
Good point. I really enjoy how easy it is on a mac when I'm on the train or something.
The difficulty for average users is the knowledge that you can tether, setting it up, and remembering to turn it off. The UIs are good at reminding you, but even I've forgotten when I've had to run off the train because I got caught up in work.
No, it's just too clunky in this day of instant gratification/”almost every application expects network access”/”computer is a communication tool not (only) an information-processing device”.
These Windows 10 laptops are using a Snapdragon 835, which is a smartphone chip that has just started appearing in smartphones. It's just a bonus business for Qualcomm.
Whether they take off or not will depend on many factors besides LTE. Those include size, weight, battery life, performance, Win32 compatibility, marketing, advertising, availability and price.
I look forward to learning more about this product. I love that we're moving non-x86 into mainstream consumer products like this (and not limited to low-end chromebooks, e.g.).
ARM SoCs lack the backward compatible x86 BIOS-style bootstrap. This is great because it's kinda crazy to think about booting into stuff like MBR, real mode, etc in 2017. But this is bad because without good standards it's really tricky to have a generic installer for another OS. Also, all or nearly all ARM SoCs have signed bootloaders. Great: less fear of a rootkit/virus hiding in my system. Awful: often used to protect subsidies or other business model hijinks; can't install an alternate OS.
> Awful: often used to protect subsidies or other business model hijinks; can't install an alternate OS.
And there's the rub. Microsoft forbids manufacturers of ARM-based Windows devices from allowing customers to disable or customize secure boot.[0] I doubt they'll change that going forward.
> On an ARM system, it is forbidden to enable Custom Mode. Only Standard Mode may be enabled.
No, this shows why (forcing) Secure Boot was a bad idea. Allowing Shim to load unsigned kernels would be equivalent to having it bypass SB in the first place, which Microsoft would presumably never sign.
Yeah, that's my biggest fear for this computer. I would snap up two or three in a heartbeat if I could put my favorite Linux distro on it.
My little 10" ARM chromebook has already left my more powerful laptop gathering dust. It seems like 'just a browser' at first glance, but it becomes so much more when you add a chroot which can launch X sessions into Chrome tabs. These Microsoft ARM machines are encouraging, since they should bring new entries into the ultraportable market, but I think I'll hold off until it's clear that we'll have options with regards to the software.
There's nothing to stop any company from making its own laptops with ARM processors, designing them to run Linux, and pre-installing Linux. You could do it yourself. Maybe this is your chance to become rich and famous.
Not sure why your first thought is to ride on the back of Microsoft's work...
> My little 10" ARM chromebook has already left my more powerful laptop gathering dust. It seems like 'just a browser' at first glance, but it becomes so much more when you add a chroot which can launch X sessions into Chrome tabs.
So you claim running X sessions in Chrome tabs is better than running X directly?
You could run a WM in a chrome tab, but I prefer breaking out individual programs. Like, I can have KiCAD running in one tab, with the datasheets for parts I'm using in nearby tabs. Or I can just break an IDE out into its own window. It's a small screen, so having the whole browser window dedicated to the program is nice, and you can even use fullscreen mode and navigate with ctrl+[shift+]tab.
ChromeOS has a nice UI. Certainly much nicer than XFCE running in a browser tab.
> Like, I can have KiCAD running in one tab, with the datasheets for parts I'm using in nearby tabs.
To me this sounds more like a case for using a different window manager than the default one that the usual GNU/Linux distributions install, if one has a small screen.
I'm no GNU/Linux wizard, but I somehow expect that such a window manager already exists.
In fact, ChromeOS acts as a window manager, because it deals with how each application is organized on screen and shown to the user; in this setup, no Chrome apps are used (only programs from the chroot).
tl;dr: it can be thought of as a WM, although it isn't one.
ChromeOS is an OS. It includes a window manager, "Ash". Saying ChromeOS to imply its own native window manager (rather than a generic Linux one running in a chroot on ChromeOS) makes sense in context. Or at least enough sense that your question seems more mocking than clarifying.
I mean, there are tiling window managers that are happy to tab windows; I use i3 and do this all the time. Bonus: tiling window managers are a great use of limited screen real estate even when showing multiple windows.
But you know what you really want: a full-fledged Linux experience, with all the drivers, backed by Google, with 12-hour battery life. ChromeOS isn't even scratching the surface of how Linux can maximize use of the hardware. I mean, do you really expect to do productive work in tabs loading JavaScript on an ARM machine?
This is lunacy. It will break compatibility with nearly all software out there. From games to productivity apps. Like the stuff I actually use a PC for.
As somebody that used a DEC Alpha running Windows NT and x86 applications thanks to FX!32 code translation back in the late 1990s, I think you're underestimating the usability of the code translation solution demoed by Microsoft et al. Sure, it may be Win32-only (at least for now) but if and when these platforms take off more UWP (sp?) applications will be released to take advantage of native performance. This is likely a very viable solution.
Productivity apps are almost never developed in assembly language. They are developed in some high-level language, so consider each such language, assuming the Windows 10 APIs stay the same (which they WILL; no need to change them):
- Java
Java applications will not require any recompilation; they will run immediately because they run on the JVM.
- C#/VB/F#
These applications will also run right away, because they run on the .NET CLR virtual machine.
- C++, C
These applications, given that they are already using the Windows APIs, should recompile with few changes.
Games nowadays are also developed using frameworks. They mostly depend on the GPU, and the GPU was never x86 either, so no big change there.
I believe Java and C# enforce specific byte orders to get around this issue, so I doubt you'll see a problem there unless interacting with C++ libraries.
Seems like a rather poor decision on Microsoft's part. Fortunately, rather few things interact with data this way anyway; most serialization formats are text-based or include endianness information.
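A small illustration of why an explicit wire byte order makes the host architecture irrelevant (a sketch in Python; writer and reader both name the byte order, so it doesn't matter what either CPU uses internally):

```python
import struct

value = 0x12345678

big = struct.pack(">I", value)     # explicit big-endian:    b'\x12\x34\x56\x78'
little = struct.pack("<I", value)  # explicit little-endian: b'\x78\x56\x34\x12'

# Unpacking with the matching format recovers the value on any architecture.
assert struct.unpack(">I", big)[0] == value
assert struct.unpack("<I", little)[0] == value
```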
Lunacy? Nah, the vast majority of software that regular people use is massively I/O bound. For the stuff that isn't (photoshop, e.g.), it will get ported. All of those non-photoshop apps will work just fine under emulation. Chrome and Edge will be ported, they'll work just fine.
I predict that this will have a negligible impact on PC games. They're CPU/memory/GPU bound and will not work well under emulation. They're not likely to get ported. But gamers are not the target market for this product.
In the end it's a chicken-and-egg problem -- you need widespread adoption of the hardware to get software ported, but also widespread porting of the software to get the hardware adopted.
And Intel/AMD will fight hard. As long as they can instill the fear that maybe the program I'm using, or the one I want to use next week/month/year, won't run, people will flock to x86. It doesn't matter so much how real that problem is; it only needs to be perceived as a problem.
Virtually all modern software vendors already write software for multiple platforms and are leaning more on cloud computing, so porting software to a new platform won't be the nightmare it used to be.
Adoption will depend on the killer features, which are currently "LTE" and "uses less power", neither of which PC users care about, so that's why this is doomed to fail.
a) Only 32-bit x86 is supported.
b) At what speed, though? x86 emulation isn't trivial, and ARM seems to be sufficiently different in architecture to incur a large translation overhead.
Microsoft has had videos out for a while of their x86 emulation. They have a demo of loading Photoshop CC, and it looks close to native speed, though I notice a bit of window refresh. Looks usable, but I'd like to see more demos.
Not only that, but if you are making UWP packages (for the windows store? I'm not sure if they are standalone as well), you can ship both x86 and ARM binaries together. I'm probably butchering the details, but the presentation and accompanying HN discussion was interesting[1]. The video is pretty cool, as they demo real running x86 and ARM binaries on the same system, and explain some seriously cool stuff WRT x86 emulation.
Qualcomm is decidedly against open source and open firmware; this will bring the closed, disposable, worthless logic board with an unmodifiable OS from your phone into your laptop.
Microsoft likes it of course because it means more control for them.
TL;DR: Intel is not being abandoned because it's too closed, it's being abandoned because it's too open!
My thoughts exactly. Wintel PCs are becoming too closed for comfort (see Intel ME, trusted/secure boot, etc.) and slowly degrading in terms of backward-compatibility, but they still have that legacy of openness inherited from the original IBM PC/AT of 1984 and continued throughout the 90s.
I wouldn't even want to call these "PCs" anymore. These are essentially locked-down mobile devices running a crude approximation of a real PC environment.
I disagree. I think this is about Microsoft and PC vendors wanting to take advantage of the economies of scale in the mobile phone market. ARM devices are higher volume than anything else. When the PC was the volume platform it allowed Intel to move into other markets because of the same type of economics.
Not everything is about that one issue. I think in this case it's typical capitalist drive to lower costs of production. Getting to move the PC user base over to subscription services like mobile phones is seen as a way to build revenue in what investors see as a stagnant PC market.
Not saying that all of that is good for consumers but, in 2017, that's not shocking.
Asus, HP, and Lenovo are all planning to introduce Snapdragon Mobile PC systems at some unspecified time in the future, for some unspecified price.
In other news, today at the Vaporware Expo every computer manufacturer announced plans to release a new device featuring unspecified hardware at some unspecified time in the future for an unspecified price.
It really is just a threat though. These companies use the appearance of a competitor to extract cheaper pricing from their suppliers. If anyone is expecting better than Netbook performance, they're probably out of their mind.
Yes, but how many people really need better than netbook performance? Most knowledge workers only need email, Excel, and a web browser. SaaS is eating the desktop. A modern smartphone is capable of 3D games and even VR.
Dropping prices temporarily until a newcomer is strangled is something we've heard of in the past, but this time the competitor has a huge and growing mobile market, so the threat and lower prices are likely to stay around for a long time.
This actually seems to be aimed more at hurting the iPad for business than hurting Intel. You can write your business apps in .NET and roll them out to users who are currently on iOS, saving the pain and cost of using tools like Xamarin to get your .NET team writing iOS apps in C#.
I think you were thinking of microarchitecture; microcode is related but probably not what the parent meant.
Information about Intel internal architectures is highly confidential, but there are remotely related technologies that are generic and public, for example
https://en.m.wikipedia.org/wiki/Register_renaming
Nah, Tomasulo and all that jazz (microarchitecture) is about improving performance once you've got a defined instruction set.
Microcode (which is one of those highly confidential things that both Intel and AMD hold close and dear) is about doing an on-the-fly CISC to RISC transformation because you realized that the legacy x86 ISA is an absolute pain to handle (but you aren't willing to give it up).
I believe you are confusing microcode (which is mainly used for complex and slow instructions and does not need to be RISC) with micro-ops, which is the Intel lingo for the internal RISC-like operations.
Register renaming is a key part of high-performance ISA emulation. x86-64 has 16 general-purpose registers, but the internal RISC core normally has around 80. Of course, this is also a key part of OOO execution and the like, as you mentioned.
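A toy sketch of the renaming idea (purely illustrative; it ignores freeing physical registers at retirement and everything else a real renamer does): each architectural write gets a fresh physical register, so reuses of the same architectural register stop creating false dependencies.

```python
# Toy register renaming: a small set of architectural registers is mapped onto
# a larger physical pool, so independent reuses of e.g. rax don't serialize.
class RenameTable:
    def __init__(self, num_physical):
        self.free = list(range(num_physical))  # free physical registers
        self.mapping = {}                      # architectural -> physical

    def read(self, arch_reg):
        return self.mapping[arch_reg]

    def write(self, arch_reg):
        phys = self.free.pop(0)                # every write gets a fresh physical reg
        self.mapping[arch_reg] = phys
        return phys

rt = RenameTable(num_physical=80)              # ~80 internal registers, as above
p0 = rt.write("rax")
p1 = rt.write("rax")                           # independent reuse of rax
assert p0 != p1                                # no write-after-write hazard
```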
I'm really excited about ARM laptops, but only if:
* They have laptop-class performance, not tablet-class performance
* They can have more than 4 GB of RAM
* You can put reasonably sized standard SATA SSDs in them, or user-replaceable M.2 drives.
* You can easily wipe Windows off them and put Linux on it.
> You can easily wipe Windows off them and put Linux on it.
None of the modems are going to be supported in Linux because they all use blob drivers and are wholly proprietary.
ARM has no BIOS or standard boot procedure. Google requires coreboot on Chromebooks, for example, which allows Linux to run on the ARM targets. It is unlikely that ARM laptops without Google's strong-arming are going to support coreboot.
Qualcomm specifically leaves significant portions of the support code for their SoCs in binary blobs, unusable from generic kernels. There is no mainline Linux kernel support for almost any Qualcomm SoC, and even then they are often some of the better ones. The other "real" options are third parties shipping ARM's own CPU cores, which come with ARM's GPUs, which have no real driver support in Linux (Lima is dead) compared to Qualcomm parts (Freedreno sees updates). Nvidia's chips are probably the closest to usable, and Samsung's Exynos are completely unusable and mega-proprietary (you will have a hard time finding even custom Android ROMs for Samsung SoCs).
The ARM ecosystem as a whole is really, really toxic. It is all NIH'd out the butt, there are no standards, everything is proprietary, and nobody contributes upstream. By comparison, x86 is an angelic fantasy.
Why wouldn't they have more than 4 GB of RAM? Current Android flagships are shipping with 6 GB of RAM, and 8 GB phones have been announced.
I'm not sure about the laptop-class performance requirement; if you mean light browsing and office tasks, they should be there already (even with the emulation penalty).
Lots of Chromebooks don't come with more than 4 GB of RAM either; there's no reason why they couldn't have more. But they mostly don't, because the target market is not seen as needing it.
There is no emulation in the traditional sense involved in running 32-bit x86 code on x86-64; the code runs directly on the CPU, and the only thing that needs to be "emulated" is the OS's ABI for system calls.
How would hardware emulation even work? Wouldn't it mean Qualcomm would have to support the x86 instructions on its ARM chip?
You may be thinking of how the Android emulator supports hardware acceleration for emulation, but that's just because your laptop has an x86 chip in it and the emulator is actually serving you the x86 image of Android, not the ARM one, so the "emulation" is more like Apple's "native simulation" for iOS.
Nvidia makes an architecture (Denver [1]) that was designed by former Transmeta people and is in many ways a direct successor to this chip. In its current forms, Denver is implemented to execute ARMv8 instructions, but it could in theory execute x86 just the same. Right now it runs ARM code via "hardware emulation" on top of its exotic internal VLIW instructions, but it could do x86 "hardware emulation" just the same.
Qualcomm uses a less exotic architecture in their chips so that's not an option for them.
This is way over my head, but IIUC this would practically be both an ARM CPU and an x86 CPU at the same time? Can it switch "on the fly" in a practical manner, and if so what kind of context switch are we talking about? Would it be viable to multitask between the two archs?
Yeah, it's still a 7-wide VLIW like the Efficeon, but it also has a unit to run native ARM code in hardware emulation at a lower IPC for code that hasn't been translated. The idea is to run code that's only used once in hardware emulation, but to recompile all the hot code to native VLIW execution.
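A toy model of that tiered scheme (illustrative Python only; translate_block / interpret_block / execute_native are placeholders, since the real machinery isn't public): run a block the cheap-to-start way until it proves hot, then cache a translation and use that from then on.

```python
HOT_THRESHOLD = 3
hit_counts = {}     # block address -> times executed so far
translations = {}   # block address -> cached "translation"

def interpret_block(addr):          # placeholder: slow, start-immediately path
    pass

def translate_block(addr):          # placeholder: expensive one-time translation
    return ("translated", addr)

def execute_native(code):           # placeholder: fast path for translated code
    pass

def run_block(addr):
    if addr in translations:
        execute_native(translations[addr])          # hot code: fast path
        return
    hit_counts[addr] = hit_counts.get(addr, 0) + 1
    if hit_counts[addr] >= HOT_THRESHOLD:
        translations[addr] = translate_block(addr)  # amortize the one-time cost
        execute_native(translations[addr])
    else:
        interpret_block(addr)                       # cold code: don't bother translating

for pc in [0x400, 0x400, 0x400, 0x400, 0x404]:      # the 0x400 block becomes "hot"
    run_block(pc)
```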
There are set-top boxes, from ISPs that are also mobile operators, which embed femtocells. When a mobile subscriber is using such a femtocell, it doesn't count towards the mobile data cap.
Although of limited range, the femtocell is public, not limited to the landline customer's use. This handily covers shadowed areas in cities as well as remote areas, reduces load on the main cells, and reduces the Tx power needed both ways.
I see this as an attack on the iPad for business. I worked for a company that had to support iPads for business; we were using Xamarin and it was a slow, painful, expensive process, though the systems were easy to manage (using an onscreen keyboard was limiting). Imagine the same benefits of the iPad for business without having to license a private app store from Apple. Imagine the cost per unit is lower and you can use .NET, without the pain of Xamarin, to roll out software as regular Win32 binaries instead of jumping through hoops to produce an iOS app in a .NET-based business.
Does anyone else get the cold sweats just thinking about a Windows machine sitting essentially naked on the internet? Sure, Windows has a firewall. I still wouldn't really trust there are no zero days in it. This will make a damn fine Chromebook or Linux machine. Windows? Particularly with Qualcomm writing the unauditable radio stack? (See Google's Project Zero for why that's a bad idea.) No thanks.
I think this is GREAT news, because it implies the following.
a. - Intel and AMD losing their pseudo-monopoly on PC/laptop chips, perhaps bringing some prices down.
b. - Mainstream PCs abandoning the old x86 (+IA64, AMD64) instruction set for a more modern one which (I guess) enables more modern, more power-efficient architectures.
but much more exciting than the above is that...
c. - The jump to a more power-efficient CPU will enable me to have a nice desktop PC with, say, 128 cores, and with no extravagant cooling system or ridiculously high power consumption (I have to pay the power bills; anything that will be on for at least 8 hours a day gets watched for power consumption). I know that there are already ways to take advantage of GPUs, but those are for specific operations. A desktop PC with a whole lot of cores will bring a more generalized performance improvement.
And TRUE hardcore multitasking...
d. - This will push all vendors and Open Source projects to improve on their compiler support for that architecture.
I love the concept -- I already use a laptop with an LTE modem, but I really wish Qualcomm would lay off giving their baseband such low-level access to the application cores.
I already begrudgingly use a phone where I know the carrier can execute arbitrary code accessing system memory, I really don't want that on my laptop too.
WiFi is to cellular connectivity what local storage is to the cloud. Apparently it's all fine and dandy, then one day someone on the other side changes their mind and you lose this or that feature; in other words, control.
Cellular connectivity can be useful, as cloud storage can be, but I'd never ever use anything that depends on either of them for connectivity or storage.
Moving every resource behind a tap the user cannot control is not the way to go, although I'm sure IT companies will keep pushing in that direction for obvious reasons.
If the SD835 is put in PCs, the performance is going to be abysmal due to the x86 emulation layer. Consequently, the price of the devices is going to be low, most likely in the same range as low-end Pentium and Atom laptops/convertibles.
The same chip, running at lower performance (due to power considerations), goes into high-end phones costing upwards of $600-900, which makes this an interesting gambit.
Wow, this is what I was talking about when I got downvoted two or three days ago. Yes, ARM/Qualcomm is in no position to compete with Intel's high-end desktop CPUs. But they are perfectly capable of competing in the low-end market and in multi-processor environments. This will hurt Intel very badly in the long term. ARM will come to the desktop, whether we like it or not.
If you are part of the cult-like group that dreams of a future Surface Phone, which includes myself, this is an interesting milestone. If such a device was in the works in the past, Intel's cancellation of Morganfield (the Broxton SoC for phones) was a huge setback.
Microsoft pursuing x86 emulation on ARM, while certainly of interest to low-cost traditional form-factors such as laptops and tablets, is also interesting because it reinvigorates the idea of a Surface Phone. Of course, the reality remains to be seen.
For those unfamiliar, the concept that many of us imagine for a Surface Phone includes an evolution of today's Windows 10 Mobile Continuum feature that, as with Ubuntu Phones and some fringe Android devices, allows a phone to act as a full computer when docked to a keyboard, mouse, and display. While Continuum exists today on devices such as the Lumia 950 and HP Elite x3, it is limited to Windows' "Universal" applications, and does not work with x86 or even older Windows Phone applications. Still, the appeal of a single-device lifestyle is considerable and many people would love to simplify the technology in their life to a phone that is also their desktop/mobile computer. Many of the youngest generation already do this by simply conceding (willingly or unknowingly) they will never enjoy the advantages of computing with large monitors, keyboards, and mice.
In order for a single device to successfully fulfill these multiple roles, it needs to be able to run real applications, and not just in the "enlarged phone OS" fashion seen on devices such as the iPad Pro.
If a Surface Phone were to materialize, I would expect Windows 10 Mobile and Windows 10 to become more converged than they are today. When using Continuum on a Lumia 950, Windows 10 Mobile is doing a respectable job of impersonating a Windows 10 desktop; but it's a veneer. For a Surface Phone to be a viable computer, I think it would start with Windows 10 (full version for ARM) and adapt downward to the phone form-factor, rather than today's opposite of a phone-tailored operating system adapting upward. Yes, W10 and W10M are very similar today, but a SP device would, I feel, push them closer together.
Of course, there is no genuinely credible news on such a device, and Microsoft has dodged questions, even downplaying the likelihood of such a device. There seems to be a widespread opinion that one should not even try to build momentum in today's phone market, which I feel is shockingly myopic and pessimistic. Of course it's not a simple matter to establish a foothold in mobile, but previous efforts, while half-hearted at times, were nevertheless making (slow) progress until Microsoft pulled the rug out from beneath Windows Phone 8. The biggest challenge to a Surface Phone would be bigger than technology: it would be a test of Microsoft's commitment to the idea.
I was very enthusiastic about both Windows Phone and Ubuntu Phone, and I still am. I'd love to replace all my devices with a single phone below 7-8" that could run x86 apps at the performance level of a 2013 MacBook Air. Since most of my dev work happens in IntelliJ IDEs and I use the cloud for more demanding machine learning tasks, I don't really need much performance from the workstation to merit having a dedicated notebook/PC for it.
If you think "I would buy one of these, except that it has mandatory Secure Boot so I can't install Linux or reinstall Windows", then you actually aren't the market for these. Companies and consumers who don't care are, and they will buy a lot of them.
Interesting point of convergence. This is probably the first really mass-market mobile VR chipset with positional tracking. Unless we turn out to need some kind of dedicated CV chip.
I hope that in the near future we'll be able to choose what we want to install on our mobile devices and use them as desktops as well. Think of it as a way better version of Samsung DeX.
Fantastic, except that in the centre of one of the most tech-heavy cities in the world, London, I can still barely scrape 200 Kb/s with Three outside Liverpool Street.
Nah, just give me x86 Win32 emulation on Android so I can ditch my laptop for mobile, which already has more RAM; as it is, I still need to keep the laptop for one work app.
There's a couple of different things at play this time. Windows RT only allowed Store apps and Microsoft's own ARM desktop apps. Windows 10 on ARM will run UWP Store apps and x86 apps (desktop & hybrid "Centennial" apps). And at Build, they announced Edge support for Progressive Web Apps, and that they'll be adding PWAs to the Store automatically if they meet all the right criteria for features & quality.
So not only will this platform support almost all the existing Windows software out there, it'll treat PWAs as first-class citizens.
Have there been any public previews of this? I'm still trying to figure out how they can emulate x86 without significant performance problems due to the different memory consistency models.
We don't know yet. Overhead could be something like 5% or 25%. We'll see when the reviews come. For what it's worth, it seems to run x86 apps just fine in Microsoft's own demo (keep in mind it's still a mobile ARM processor, so performance will be similar to Atom/Celeron/Pentium, rather than Core i5, etc).
I don't trust product demos, movie trailers, or press releases, so I'll reserve my enthusiasm until someone in the "real world" has had the chance to get their hands on this.
What I know is this: ARM emulation via QEMU on my Core i7 was pretty slow last time I tried it. Not unusably slow, but still slow. And I have tons of RAM and a fat CPU.
These laptops, especially if they are made on the cheap side, won't have tons of RAM and a fat CPU. Unless Microsoft pulled some magic tricks, I don't see this emulation being usable.
"faster than QEMU" isn't a particularly high bar to clear, though. In particular if you're only emulating a single process rather than a full guest OS then you can reduce the overhead quite a bit. Plus if you design something from the start for the single-process emulation case, and only need to worry about one guest and one host architecture, and focus on emulation performance as a key goal, you can likely do better than QEMU without too much trouble.
Emulation is slower, but most people will be using a browser/Office/VS Code etc., so it's just for those extra few apps that might stop you from buying the product if emulation didn't exist.