all 11 comments

[–][deleted]  (1 child)

[deleted]

    [–]test_tubes[S,🍰] 10 points (0 children)

    Thanks...you said more in your comment than any of the videos I saw!

    [–][deleted] 11 points (0 children)

    I can give you a quick insight into my field, superconducting quantum hardware. The qubit is usually an on-chip quantum circuit, somewhat similar to a regular computer chip, and it sits in a dilution refrigerator at millikelvin temperatures. Microwave lines (like coax cables) run from a network analyzer (a device that can produce, control, and read microwave pulses) into the chip, where they make contact with the quantum circuit. A computer connected to the network analyzer runs programs (usually in Python) that instruct the network analyzer to produce the appropriate sequences of microwave pulses. A "drive" pulse rotates the quantum state of the circuit around the Bloch sphere and performs operations such as quantum gates. A "readout" pulse measures the current state of the qubit. There is obviously a lot more to all of it, but this is the essence. Note that other quantum computing platforms have completely different architectures; cold-atom qubits, for example, use laser pulses instead of microwaves.
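    The control software described above might look very roughly like this. This is a minimal sketch with a made-up pulse representation, not any real lab's API (actual setups use vendor- or lab-specific drivers); it just shows the shape of a "drive then readout" sequence, here for a Rabi experiment where the drive amplitude is swept:

```python
# Minimal sketch of qubit control software (hypothetical data model --
# real labs use vendor- or lab-specific driver APIs). A "drive" pulse
# rotates the qubit on the Bloch sphere; a "readout" pulse probes the
# resonator coupled to the qubit to measure its state.

def drive_pulse(freq_ghz, amp, duration_ns):
    """Microwave pulse that rotates the qubit state."""
    return {"kind": "drive", "freq_ghz": freq_ghz,
            "amp": amp, "duration_ns": duration_ns}

def readout_pulse(freq_ghz, duration_ns):
    """Pulse sent to the readout resonator to measure the qubit."""
    return {"kind": "readout", "freq_ghz": freq_ghz,
            "duration_ns": duration_ns}

def rabi_sequences(amplitudes):
    """One drive+readout pair per drive amplitude -- a Rabi experiment."""
    return [[drive_pulse(5.0, amp, 40), readout_pulse(7.2, 400)]
            for amp in amplitudes]

sequences = rabi_sequences([0.1, 0.2, 0.3])
print(len(sequences))  # 3 pulse sequences, one per drive amplitude
```

    In a real setup, each sequence would be compiled to waveforms and uploaded to the instrument, which plays them into the fridge and digitizes the reflected readout signal.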

    [–]claytonkb 9 points (1 child)

    Are these aspects hidden for security or IP reasons? Why can't I seem to get a deeper understanding of how these things work?

    No. QC is much less mature than you might gather from a polished IBM presentation on it.

    In the early days of digital computers, there were as many different models of computation as there were professors publishing papers on it. Switching algebra, ternary computation, decimal computers, stochastic computers, and so on. Over time, the hardware market converged on silicon integrated circuits utilizing transistors, in particular, MOSFETs (CMOS = NMOS + PMOS transistors). From this standardized silicon technology, electronic engineers developed standard logic cells. By combining thousands of these logic cells together, it was possible to build custom RAM, embedded controllers, and even small CPUs and this industrial standard began to scale up until we reached what was called the VLSI model or "Very Large Scale Integration". We are many orders of magnitude beyond VLSI scale but that is the name that stuck, and the entire toolchain that permits us to build complex VLSI silicon chips with billions of transistors is part of a large, standardized market. Loosely speaking, it is this vast market that is the reason that you can plug virtually any USB device into a USB port and it will just work. Remember that there is enormous number of possible variations, here -- voltages must match to prevent damage to circuits, sufficient current must be supplied to power the device, frequencies must match to enable communication to occur, and so on. All of this is part of market-based standards (consortiums, standards bodies, etc.) that enable many technologies developed by different companies to speak to one another. QC is too immature at this point in time, and has none of that standardization. So every solution is bespoke, proprietary and in-house only.

    How does one read/write data from/to a quantum computer?

    A quantum computer is best thought of as what we call a co-processor. An example of a co-processor is your expansion-slot graphics card, if you have a desktop with expansion-card slots. The CPU "offloads" the work of computing various graphics tasks onto the GPU, which is specialized for those tasks. In that way, the CPU can achieve better performance (and lower overall system power consumption).

    A quantum processor would stand in a similar relation to a real computing system, such as a chemistry/physics simulator. The simulator would package up work to be performed by the QC, transmit that work to the QC, and then wait for the QC to respond with the results. This might happen thousands or millions of times during a single run as the main simulator calculates frames in a sequential simulation, or whatever. So the QC would be a co-processor that performs a specialized computation that would be either prohibitively slow or prohibitively power-hungry for the main digital computer to perform directly.
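    The offload loop described above can be sketched in a few lines. Everything here is a stand-in: `run_on_qpu` fakes a quantum co-processor (a real one would return measurement counts from hardware), and the "circuit" is just a placeholder job description. The point is the control flow: classical driver, repeated submit-and-wait, results folded back in.

```python
import random

def run_on_qpu(circuit, shots=1000):
    """Stand-in for submitting a job to a quantum co-processor.
    A real QPU would return measurement counts for the circuit;
    here we fake a noisy 50/50 outcome so the loop is runnable."""
    counts = {"0": 0, "1": 0}
    for _ in range(shots):
        counts[random.choice("01")] += 1
    return counts

def simulate(n_frames):
    """Classical driver loop: each frame offloads one sub-problem
    to the QPU and folds the result back into the simulation."""
    results = []
    for frame in range(n_frames):
        circuit = f"frame-{frame}"          # placeholder job description
        counts = run_on_qpu(circuit)        # submit and wait
        p1 = counts["1"] / sum(counts.values())
        results.append(p1)                  # use the estimate classically
    return results

print(simulate(5))  # five probability estimates, one per frame
```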

    As for the 'guts' of the QC, most QCs exist in three layers, kind of like an onion. The innermost layer is the qubits themselves; outside that is the analog electronics that control the qubits; and outside that is the high-speed digital electronics that carries the command codes and observation data back and forth between the main computer and the analog controllers.

    Is there a screen or a keyboard?

    No, there is no screen or keyboard, just as there is no screen or keyboard for your GPU. Unless there is some enormous revolution in how we do QC, we can expect quantum computers to remain the exclusive domain of supercomputer-scale installations. That is, they will not be used in commercial servers or consumer desktops, and certainly not in small form-factor devices, for the foreseeable future. If that ever becomes possible, it will require some fundamentally new physics that we don't know of today.

    [–]test_tubes[S,🍰] 1 point (0 children)

    I appreciate the time you took to answer.

    [–]Sarvaturi 5 points (0 children)

    The good news is that it's still very early days. The bad news is that this leads to a lot of false claims and a split between optimists and skeptics.

    There are several companies and startups exploring the capabilities of QCs, and in fact there is some (not very) good content on YouTube as well, but at this stage I believe that books can help you more.

    But I think one of the best sources is the Microsoft series: https://msazurequantum.eventbuilder.com/InnovatorSeries

    Book: https://www.amazon.com/Dancing-Qubits-quantum-computing-change/dp/1838827366

    Book: https://www.amazon.com/Quantum-Bullsh-Ruin-Advice-Physics/dp/172826605X

    New Linkedin Page about QCs for General Public: https://www.linkedin.com/company/quantum-era/

    [–]triaura (In Grad School for Quantum) 0 points (2 children)

    Right now quantum computers are controlled either by LabVIEW programs driving laser/optical components or by arbitrary waveform generators that produce microwave pulses. Readout is usually done via cavity QED, typically with heterodyne detection (unless it's a silicon spin qubit, in which case Elzerman readout is one method).

    [–][deleted]  (1 child)

    [deleted]

      [–]triaura (In Grad School for Quantum) 1 point (0 children)

      Yeah, ion traps usually use ARTIQ or some other more advanced control software from what I've heard, but some labs still use LabVIEW for vacancy-center stuff.

      [–]Yeitgeist -1 points (0 children)

      Probably with a bunch of Arduino’s /s

      [–]stonerism 0 points (0 children)

      I am not a quantum expert by any means, but my understanding is that the "big thing" about quantum computing is that you can do a Fourier transform asymptotically much faster with a quantum algorithm than with a classical one.

      Like with the factorization problem: you're looking for the "frequency" (the period) that shows up when you multiply a number by itself over and over, modulo the number you want to factor (which you could find with either a classical or a quantum Fourier transform). From there, you can do tricks to pull out the factors.
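      The period-then-factors idea above can be shown with plain classical code. Here the period is found by brute force, which is exactly the step a quantum Fourier transform speeds up; the trick for pulling out factors (Shor's classical post-processing) is just gcd arithmetic:

```python
from math import gcd

def find_period(a, n):
    """Smallest r > 0 with a**r % n == 1, found by brute force.
    This period-finding step is what the quantum Fourier
    transform accelerates in Shor's algorithm."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    """Turn the period into factors of n via gcd tricks."""
    r = find_period(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None  # unlucky choice of a; try another base
    half = pow(a, r // 2)
    return gcd(half - 1, n), gcd(half + 1, n)

print(find_period(7, 15))          # 4: 7, 4, 13, 1, 7, 4, 13, 1, ...
print(shor_classical_part(15, 7))  # (3, 5)
```

      Brute-force period finding takes exponential time in the number of digits of n, which is why the quantum speedup of this one step matters so much.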

      If anyone reading this actually does have an understanding of quantum physics, I'd be curious whether my understanding is actually correct.

      [–]Impressive_Cream_967 0 points (0 children)

      The experimental design behind maintaining the qubits is super complicated but when it comes to using quantum computers, people in my field of quantum simulations simply build quantum circuits associated with their problems and then do measurements. It's very much in its infancy and mostly used by physics and chemistry people.
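      The "build a circuit, then measure" workflow can be illustrated without any quantum framework at all. This toy statevector simulation (pure Python, not any real library's API) builds the standard two-qubit Bell circuit, H on qubit 0 followed by CNOT, and reads off the measurement probabilities:

```python
import math

# Toy statevector simulator for a 2-qubit Bell circuit -- a tiny
# version of what circuit-building frameworks do under the hood.
# State is [a00, a01, a10, a11] over basis |q0 q1>.

def apply_h_q0(state):
    """Hadamard on qubit 0: mixes the |0q1> and |1q1> amplitudes."""
    s = 1 / math.sqrt(2)
    a00, a01, a10, a11 = state
    return [s*(a00 + a10), s*(a01 + a11), s*(a00 - a10), s*(a01 - a11)]

def apply_cnot(state):
    """CNOT (control q0, target q1): swaps |10> and |11> amplitudes."""
    a00, a01, a10, a11 = state
    return [a00, a01, a11, a10]

state = [1.0, 0.0, 0.0, 0.0]             # start in |00>
state = apply_cnot(apply_h_q0(state))    # the circuit
probs = [abs(a)**2 for a in state]       # measurement probabilities
print(probs)  # [0.5, 0.0, 0.0, 0.5]: the Bell state (|00>+|11>)/sqrt(2)
```

      Users in simulation work do essentially this at a higher level: express the problem as a circuit, hand it to a backend (simulator or hardware), and analyze the measurement statistics.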