Science
New advances promise secure quantum computing at home -- phys.org
https://phys.org/news/2024-04-advances-quantum-home.html

This is basically describing a technology that would allow people to control their own data while using computing power in the cloud. Currently we are required to upload (albeit perhaps invisibly) our information to the cloud for processing. This includes personal information (PII, PHI), financials, documents, etc.
Quantum computing is developing rapidly, paving the way for new applications that could transform services in many areas like health care and financial services. It works in a fundamentally different way than conventional computing and is potentially far more powerful. However, it currently requires controlled conditions to remain stable and there are concerns around data authenticity and the effectiveness of current security and encryption systems.
...
The results could ultimately lead to commercial development of devices to plug into laptops, to safeguard data when people are using quantum cloud computing services.
bucolic_frolic
(47,016 posts)

I've been reading for ten minutes, and all I learned was that it's below the level of individual atoms and there's lots of data, but not much else.
erronis
(16,875 posts)and not easily absorbed by us mere mortals who are used to the more Newtonian world.
The main driver of this technology, if I understand it at all, is that information can be transmitted via entangled particles so that a change at one end (say your desktop) of the pipeline causes a near-instant change at the other (say the computing server).
I haven't read this but it popped up at the top of my search results:
https://en.wikipedia.org/wiki/Quantum_entanglement
caraher
(6,308 posts)

There are correlations that change basically instantaneously, but it's a well-known result in the field that you cannot use those changes to transmit information.
What quantum computing does allow is a sort of supercharged parallelism that, for example, enables new algorithms whose computation time scales differently with the size of the computational task.
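A concrete instance of that different scaling (my example, not from the post): unstructured search over N items takes on the order of N oracle queries classically, while Grover's quantum algorithm needs only about (π/4)·√N. A quick sketch of the query counts:

```python
import math

def classical_queries(n):
    # Worst-case oracle queries for classical unstructured search
    # (about n/2 on average, n in the worst case)
    return n

def grover_queries(n):
    # Grover's algorithm needs roughly (pi/4) * sqrt(n) oracle queries
    return math.ceil(math.pi / 4 * math.sqrt(n))

for n in (1_000, 1_000_000, 10**9):
    print(f"N={n:>13,}  classical: {classical_queries(n):>13,}  "
          f"Grover: {grover_queries(n):>7,}")
```

For a billion items that's roughly 25,000 quantum queries versus a billion classical ones, which is the kind of scaling change caraher is describing; it's a better asymptotic exponent, not "trying every answer at once."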
The headline is vastly over-hyped; this is a proof-of-principle experiment that remains quite a way off from applications. Not to say that it isn't an important step... just don't expect to use this technology anytime soon.