• 0 Posts
  • 129 Comments
Joined 2 years ago
Cake day: July 7th, 2024


  • There is a variant of classical computers called probabilistic computers where logic gates can randomly perturb a bit. These bits are called p-bits. Since the system is random, you can’t describe it by specifying each bit’s value individually, because you don’t know those values. Instead, you specify it with a vector where each entry represents the probability of the bit being a particular value. For example, you would represent a single p-bit with a two-vector where the first number is the probability of 0 and the second the probability of 1.

    You also have to use a single vector for the whole system, called the state vector. If you have two p-bits, you don’t use two two-vectors, but a single four-vector. This is because two p-bits can become statistically correlated with one another, and so you need to represent them together to keep track of correlations. If you represent each p-bit individually, you will lose information about correlations.

    The state vector grows exponentially because it holds the probabilities for all possible outcomes, and for N bits there are 2^N possible outcomes. If we knew the state of the machine at a given time, we could represent it with a separate two-vector for each bit, giving us a complexity of 2N (linear) for all the two-vectors combined, but the fact that we are ignorant of its state requires us to represent it with a single vector of complexity 2^N (exponential).
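
    To make the bookkeeping concrete, here is a small sketch in Python with numpy (the variable names and the specific distribution are just my own toy example, not anything standard): a single joint vector keeps track of the correlations, the separate per-bit vectors lose them, and the joint vector grows as 2^N while the separate vectors only grow as 2N.

```python
import numpy as np

# One p-bit: a two-vector of probabilities [P(0), P(1)]
p_bit = np.array([0.5, 0.5])

# Two correlated p-bits: a single four-vector over the outcomes 00, 01, 10, 11.
# In this toy distribution the two bits always agree.
joint = np.array([0.5, 0.0, 0.0, 0.5])

# Marginal two-vectors for each bit, found by summing out the other bit
bit_a = np.array([joint[0] + joint[1], joint[2] + joint[3]])   # [P(a=0), P(a=1)]
bit_b = np.array([joint[0] + joint[2], joint[1] + joint[3]])   # [P(b=0), P(b=1)]

# Rebuilding the joint from the separate marginals loses the correlation:
# it wrongly gives probability 0.25 to 01 and 10, which never actually occur.
print(np.kron(bit_a, bit_b))   # [0.25 0.25 0.25 0.25], not equal to joint

# Separate two-vectors need only 2N numbers, but the joint vector needs 2^N entries
for N in (2, 10, 20, 300):
    print(N, "bits:", 2 * N, "numbers separately vs", 2 ** N, "jointly")
```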

    What even is a probability distribution? Say I tell you that a biased coin has a 25% chance of landing heads and a 75% chance of landing tails. What does that even mean? One answer is that the probability distribution represents an ensemble. An ensemble is an idealized experiment where you flip the same coin an infinite number of times and then tally up the results; such a distribution should have precisely the ratio 25:75. An experiment run an infinite number of times will clearly have greater complexity than an experiment run just one time. The exponential complexity of the statistical description is usually understood to come from the fact that it represents an ensemble and not an individual system.

    It turns out that the “logic” that underlies quantum mechanics can be described by mathematics that looks exactly like the mathematics of normal probability theory, but with the introduction of negative signs. These probabilities that can be negative are called quasi-probabilities. Despite a common misconception, imaginary numbers are not necessary for quantum mechanics. You can reproduce all of quantum information science with the introduction of negative numbers alone.

    When you build out a probability tree, negative numbers allow certain paths to cancel out against other paths that have positive quasi-probabilities. This cannot happen in classical probability theory, where probabilities are either positive or zero and thus can only accumulate. This cancelling is called interference, and it is the hallmark of quantum mechanics. Even entanglement effects, such as those shown in violations of Bell inequalities, are just interference effects across statistically correlated systems.
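
    Here is a tiny toy illustration in Python with numpy (my own example; it uses real-valued amplitudes whose squares give probabilities, which is enough to show the point about negative numbers): with a stochastic matrix the randomness can only accumulate, while a gate containing a negative entry can undo itself, because the two paths into one of the outcomes cancel exactly.

```python
import numpy as np

# Classical: a fair-coin gate is a stochastic matrix acting on a probability two-vector
coin = np.array([[0.5, 0.5],
                 [0.5, 0.5]])
p = np.array([1.0, 0.0])          # definitely in state 0
print(coin @ (coin @ p))          # [0.5 0.5]: applying it twice just stays random

# Quantum-style: a gate with a negative entry (real amplitudes; squares give probabilities)
h = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)
a = np.array([1.0, 0.0])          # definitely in state 0
print((h @ (h @ a)) ** 2)         # ~[1, 0]: the two paths into state 1 cancel exactly
```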

    Here is where it gets a bit weird. Recall that we said the exponential complexity of the state vector of a probabilistic computer is assumed to be derivative of combining an infinite number of samples of a simpler, linear, deterministic system, a combination we call an ensemble. It turns out that you can prove it is impossible for a system described by quasi-probabilities to be decomposed in the same way. There is no simpler, linear, deterministic system from which an infinite number of samples can give you a “quasi-ensemble.”

    This means one of two things. (1) Quasi-probabilistic systems are fundamentally random, unlike classically probabilistic systems, because there simply cannot be a simpler underlying deterministic state, or (2) there does exist an underlying deterministic state, but it has similar complexity to the “quasi-ensemble” itself. That is to say, the underlying, deterministic, physical state of the system really is exponentially complex: as you add N qubits, the complexity of the internal dynamics of the system really does grow as 2^N.

    Whichever conclusion you draw, the outcome is the same: unlike classical probability theory, where we assume the exponential complexity of the statistical description is ultimately derivative of our ignorance of an underlying state, in quantum mechanics you either have to assume such an underlying state does not exist, or that the system really is exponentially complex. In either case, you can only work with the exponentially complex description. There is no other description to work with.

    This makes it impossible to efficiently simulate quantum computers with a classical computer, since the underlying complexity grows exponentially with each qubit you have. If you have 300 qubits, then the size of the state vector is 2^300, which is greater than the number of atoms in the observable universe. Whether you believe #1 or #2, the outcome is, again, the same: there is simply no way to break this apart into an infinite number of samples of a linearly complex system. To simulate it correctly and losslessly, you must indeed use a vector of this size, and so a lossless simulation of even 300 qubits on a classical computer is impossible.

    This extra complexity means that the internal dynamics of a quantum computer is much more complicated than that of a classical computer, much more stuff is “going on” so to speak, and so it can in principle deliver much more compute if you can figure out how to leverage that in a useful way.

    Quasi-probabilities aren’t actually used that often in practice, because introducing negative signs into classical probability theory breaks L1 normalization unless you double up the size of the state vector. It is more mathematically concise to also introduce imaginary numbers, which lets you keep the state vector the same size as it is in classical probability theory, but containing complex numbers. These are called probability amplitudes. That is why imaginary numbers are used in quantum mechanics: they are not necessary, just more concise. What is absolutely necessary and indispensable is the negative numbers, as these are what allow certain paths on the probability tree to cancel out against other paths.

    Yes, you do just work with a vector and matrices that apply to that vector. The vector can contain either quasi-probabilities or probability amplitudes. Besides that, it pretty much works like normal probability theory. Each entry is associated with the likelihood of a particular outcome. If you have 3 qubits, you need an eight-vector because 2^3 = 8, where the probability amplitudes are associated with the likelihoods of observing 000, 001, 010, 011, 100, 101, 110, or 111 respectively. Unlike the stochastic matrices of a classical probabilistic computer, you use unitary matrices, which can also contain complex numbers.
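
    For concreteness, here is a minimal numpy sketch of that eight-vector picture (my own toy example, not any particular library’s API). The register is eight complex amplitudes, a gate on the whole register is an 8×8 unitary built from Kronecker products, and the likelihood of each outcome is the squared magnitude of its amplitude.

```python
import numpy as np

I = np.eye(2, dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Eight probability amplitudes, ordered 000, 001, 010, 011, 100, 101, 110, 111.
# Start with all three qubits definitely 0: all amplitude sits on 000.
state = np.zeros(8, dtype=complex)
state[0] = 1.0

# A gate on the whole register is an 8x8 unitary; here, a Hadamard on the first
# qubit and identity on the other two, built with the Kronecker product.
U = np.kron(H, np.kron(I, I))
state = U @ state

# The likelihood of each outcome is the squared magnitude of its amplitude
print(np.abs(state) ** 2)            # 0.5 for 000, 0.5 for 100, 0 elsewhere

# Unitary gates preserve total probability: the squared magnitudes still sum to 1
print((np.abs(state) ** 2).sum())    # 1.0
```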

    Besides the use of complex numbers, the mathematics is, again, pretty much identical to regular old probability theory.

    It may seem “abstract” because, in classical probability theory, you assume that the state vector is an abstract idealized description of an ensemble, which is different from the underlying physical state. But in quantum mechanics, the state vector cannot possibly be merely an idealized description of an ensemble. Either such an underlying state does not physically exist, or whatever physical state does exist, it must be related to the quantum state vector with similar complexity, and indeed some physicists interpret the quantum state vector to actually be the underlying physical state of the system.

    There is no agreement or academic consensus on how to “interpret” the physical meaning of the mathematics, in terms of natural philosophy. Everyone agrees on the mathematics itself and how to make predictions with it, but if you ask what the mathematics actually physically represents, what is “really going on” inside a quantum computer, or, as Bell called them, what the “beables” of the theory are, this is a rather controversial topic without agreement.

    Personally, I tend to be torn between two different answers to this.

    One possibility is the answer of the physicists Carlo Rovelli and Francois-Igor Pris, which takes position #1: outcomes are truly random and there is no underlying physical state, because physical states are only meaningful when defined relative to another system during an interaction, and so it makes no coherent sense to speak of a particle as having an autonomous state as a property of itself. All the “paradoxes” in quantum mechanics disappear if you stop making absolute statements about particles, like “the spin of the electron is down,” and instead always specify relative to what.

    Another answer may be something akin to David Bohm’s pilot wave theory, which takes position #2: you assume that there is an underlying, simpler, deterministic state, but that it exists alongside the quantum state. The quantum state is a separate thing, a separate “beable,” which influences how the particles behave. This gives you a picture that feels fairly Newtonian. I used to be more skeptical of this approach because the physicist John Bell proved such a picture cannot be compatible with special relativity, which, if true, might make it impossible for such an approach to reproduce the predictions of quantum field theory. However, the physicist Ward Struyve showed that while it is indeed not compatible with special relativity, it does not follow that it cannot reproduce the predictions of quantum field theory, and that this is not actually an issue.

    There are many other views in the literature as well.



  • The reason quantum computers are theoretically faster is the non-separable nature of quantum systems.

    Imagine you have a classical computer where some logic gates flip bits randomly, and multi-bit logic gates can flip them randomly but in a correlated way. These kinds of computers exist and are called probabilistic computers, and you can represent all the bits using a vector and the logic gates with matrices called stochastic matrices.

    The vector is necessarily non-separable, meaning you cannot get the right predictions if you describe the statistics of the computer with a vector assigned to each p-bit separately; you must assign a single vector to all p-bits taken together. This is because the statistics can become correlated with each other, i.e. the statistics of one p-bit can depend upon another, and thus if you describe them using separate vectors you will lose information about the correlations between the p-bits.
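
    As a concrete toy example (a Python/numpy sketch of my own, with made-up numbers): a “flip both bits together half the time” gate is a 4×4 stochastic matrix acting on the joint four-vector, and after applying it, each bit looks 50/50 on its own even though the two bits always agree, which is exactly the correlation information separate per-bit vectors would miss.

```python
import numpy as np

# Joint four-vector over the outcomes 00, 01, 10, 11: both p-bits start at 0
p = np.array([1.0, 0.0, 0.0, 0.0])

# "Correlated flip" gate: with probability 1/2 leave both bits alone,
# with probability 1/2 flip both together. Each column sums to 1 (stochastic matrix).
G = np.array([
    [0.5, 0.0, 0.0, 0.5],   # -> 00
    [0.0, 0.5, 0.5, 0.0],   # -> 01
    [0.0, 0.5, 0.5, 0.0],   # -> 10
    [0.5, 0.0, 0.0, 0.5],   # -> 11
])

print(G @ p)   # [0.5 0. 0. 0.5]: each bit looks 50/50 on its own, but they always agree
```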

    The p-bit vector grows in complexity exponentially as you add more p-bits to the system (complexity = 2^N where N is the number of p-bits), even though the total states of all the p-bits only grows linearly (complexity = 2N). The reason for this is purely an epistemic one. The physical system only grows in complexity linearly, but because we are ignorant of the actual state of the system (2N), we have to consider all possible configurations of the system (2^N) over an infinite number of experiments.

    The exponential complexity arises from considering what physicists call an “ensemble” of individual systems. We are not considering the state of the physical system as it currently exists right now (which only has a complexity of 2N) precisely because we do not know the values of the p-bits, but we are instead considering a statistical distribution which represents repeating the same experiment an infinite number of times and distributing the results, and in such an ensemble the system would take every possible path and thus the ensemble has far more complexity (2^N).

    This is a classical computer with p-bits. What about a quantum computer with q-bits? It turns out that you can represent all of quantum mechanics simply by allowing probability theory to have negative numbers. If you introduce negative numbers, you get what are called quasi-probabilities, and this is enough to reproduce the logic of quantum mechanics.

    You can imagine that quantum computers consist of q-bits that can be either 0 or 1 and logic gates that randomly flip their states. But rather than representing a q-bit in terms of its probability of being 0 or 1, you represent it with four numbers: the first two associated with its probability of being 0 (summing them together gives you the real probability of 0) and the second two associated with its probability of being 1 (summing them together gives you the real probability of 1).

    Like normal probability theory, the numbers have to all add up to 1, being 100%, but because you have two numbers assigned to each state, you can have some quasi-probabilities be negative while the whole thing still adds up to 100%. (Note: we use two numbers instead of one to describe each state with quasi-probabilities because otherwise the introduction of negative numbers would break L1 normalization, which is a crucial feature to probability theory.)
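
    Just to illustrate that bookkeeping with made-up numbers (the specific values below are purely my own illustration of the structure described above, not taken from anywhere):

```python
# Two quasi-probabilities per outcome; these particular values are made up
q = [0.75, -0.25, 0.25, 0.25]

p0 = q[0] + q[1]          # real probability of observing 0
p1 = q[2] + q[3]          # real probability of observing 1

print(p0, p1, sum(q))     # 0.5 0.5 1.0: one entry is negative, yet the total is still 100%
```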

    Indeed, with that simple modification, the rest of the theory just becomes normal probability theory, and you can do everything you would normally do in normal classical probability theory, such as build probability trees and whatever to predict the behavior of the system.

    However, this is where it gets interesting.

    As we said before, the exponential complexity of classical probability is assumed to be merely epistemic, because we are considering an ensemble of systems, even though the physical system in reality only has linear complexity. Yet, it is possible to prove that the exponential complexity of a quasi-probabilistic system cannot be treated as epistemic. There is no classical system with linear complexity where an ensemble of that system will give you quasi-probabilistic behavior.

    As you add more q-bits to a quantum computer, its complexity grows exponentially in a way that is irreducible to linear complexity. For a classical computer to keep up, every time an additional q-bit is added you have to increase the number of bits needed to simulate it in a way that grows exponentially. At just 300 q-bits, the complexity would be 2^N = 2^300, which means the number of bits you would need to simulate it would exceed the number of atoms in the observable universe.
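
    The arithmetic is easy to check; here the ~10^80 figure for the number of atoms in the observable universe is the usual rough order-of-magnitude estimate:

```python
# The ~10^80 figure for atoms in the observable universe is the usual rough estimate
amplitudes = 2 ** 300
atoms = 10 ** 80
print(amplitudes > atoms)        # True
print(len(str(amplitudes)))      # 91 digits, i.e. roughly 2 x 10^90
```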

    This is what I mean by quantum systems being inherently “non-separable.” You cannot take an exponentially complex quantum system and imagine it as separable into an ensemble of many individual linearly complex systems. Even if it turns out that quantum mechanics is not fundamental and there are deeper deterministic dynamics, the deeper deterministic dynamics must still have exponential complexity for the physical state of the system.

    In practice, this increase in complexity does not mean you can always solve problems faster. The system might be more complex, but it requires clever algorithms to figure out how to actually translate that into problem solving, and currently there are only a handful of known algorithms you can significantly speed up with quantum computers.

    For reference: https://arxiv.org/abs/0711.4770


  • If you have a very noisy quantum communication channel, you can combine a second protocol called entanglement distillation with quantum teleportation to effectively bypass the noisy quantum channel and send a qubit using only a classical communication channel. That is the main utility I see for it. Basically, it is very useful for transmitting qubits over a noisy quantum network.


  • The people who named it “quantum teleportation” had in mind Star Trek teleporters which work by “scanning” the object, destroying it, and then beaming the scanned information to another location where it is then reconstructed.

    Quantum teleportation is basically an algorithm that performs a destructive measurement (kind of like “scanning”) of the quantum state of one qubit and then sends the information over a classical communication channel (could even be a beam if you wanted) to another party which can then use that information to reconstruct the quantum state on another qubit.

    The point is that there is still the “beaming” step, i.e. you still have to send the measurement information over a classical channel, which cannot exceed the speed of light.
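
    If it helps to make this concrete, here is a minimal simulation of the protocol in Python with numpy (my own sketch with my own variable names, not any particular library’s API): Alice entangles her data qubit with her half of a shared Bell pair, measures her two qubits, sends the two resulting classical bits, and Bob applies the corresponding corrections to his half of the pair to recover the original state.

```python
import numpy as np

rng = np.random.default_rng()

# Single-qubit gates
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Qubit 0 is Alice's data qubit in an arbitrary state a|0> + b|1>
a, b = 0.6, 0.8j                                   # any normalized pair works
psi_in = np.array([a, b], dtype=complex)

# Qubits 1 (Alice) and 2 (Bob) share a Bell pair (|00> + |11>)/sqrt(2)
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Joint 3-qubit state: 8 amplitudes, with qubit 0 as the most significant bit
state = np.kron(psi_in, bell)

# CNOT with qubit 0 as control and qubit 1 as target, as an 8x8 permutation matrix
CNOT01 = np.zeros((8, 8), dtype=complex)
for i in range(8):
    b0, b1, b2 = (i >> 2) & 1, (i >> 1) & 1, i & 1
    CNOT01[(b0 << 2) | ((b1 ^ b0) << 1) | b2, i] = 1

# Alice's "scan": CNOT, then Hadamard on her data qubit, then measure qubits 0 and 1
state = CNOT01 @ state
state = np.kron(H, np.kron(I, I)) @ state

probs = np.abs(state) ** 2
p_m = np.array([probs[2 * m] + probs[2 * m + 1] for m in range(4)])  # marginal over Bob's qubit
m = rng.choice(4, p=p_m)                 # Alice's measurement result, two bits packed in one int
m0, m1 = (m >> 1) & 1, m & 1

# Bob's qubit after the measurement: the two amplitudes consistent with (m0, m1), renormalized
bob = state[2 * m: 2 * m + 2] / np.sqrt(p_m[m])

# Alice sends (m0, m1) over a classical channel (the "beaming" step, limited by light speed);
# Bob applies the corresponding corrections to his qubit
if m1:
    bob = X @ bob
if m0:
    bob = Z @ bob

print(np.allclose(bob, psi_in))          # True: Bob's qubit is now a|0> + b|1>
```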


  • Obvious answer is that the USA is the world’s largest economy while Russia is not. If the USA says “if you trade with Russia then you can’t trade with me,” most countries will happily cease trade with Russia to remain in the US market, but if Russia said the same about the USA, people would just laugh and go trade with the USA.

    The only country that might have some leverage in sanctioning the US is China, but China has historically had a “no allies” policy. Chinese leadership hate the idea of alliances because they would then feel obligated to defend an ally, and defending another country is viewed very poorly in Chinese politics. They thus only ever form trade relations and never alliances, meaning that if your country is attacked they have no obligation to you. Chinese politicians may verbally condemn the attack, but they won’t do anything like impose sanctions, let alone provide their own military support in return.






  • I tried to encourage fellow Linux users to just recommend one distro. It doesn’t have to be a good distro, just one the person is least likely to run into issues with and, if they do, the one for which they are most likely to find solutions easily. Things like Ubuntu and Mint clearly fit the bill. They can then decide later if they want to change to a different one based on what they learn from using that one.

    No one listened to me, because everyone wants to recommend their personal favorite distro rather than what would lead to the fewest problems for the user and be the easiest to use. A person who loves PopOS will insist the person must use PopOS. A person who loves Manjaro will insist the person must use Manjaro. Linux users like so many different distros that everyone ends up recommending something different, which just makes it confusing.

    I gave up even bothering after a while. Linux will never be big on the desktop unless some corporation pushes a Linux-based desktop OS.



  • I have used Debian as my daily driver for at least a decade, but I still recommend Mint because it has all the good things about Debian with extras on top.

    Debian developers just push out kernel updates without warning you about any possible system incompatibilities. For example, if you have an Nvidia GPU you might get a notification to “update,” and a normie will likely press it only for the PC to boot to a black screen, because Debian pushed out a kernel update that breaks compatibility with the Nvidia drivers and did nothing to warn the user about it. A normie probably won’t know how to get out of the black screen to a TTY and roll back the update.

    I remember this happening before, and I had to go to the /r/Debian subreddit and respond to all the people freaking out, explaining to them how to fix their system and roll back the update.

    Operating systems like Ubuntu, Mint, PopOS, etc., do more testing on their kernels before rolling them out to users. They also tend to have more up-to-date kernels. I had Debian on everything but the gaming PC I had built recently, because Debian 12 used such an old kernel that it wouldn’t support my motherboard hardware. This was a kernel-level issue and couldn’t be fixed just by installing a new driver. Normies are not going to want to compile their own kernel for their daily driver, and neither do I, even with a lot of experience with Linux.

    I ended up just using Mint on that PC until Debian 13 was released, because my only other options would have been to switch to the unstable or testing branch, or to compile my own kernel, neither of which I cared to do on a PC I just wanted to work and play Horizon or whatever on.


  • I agree, I just think that part of that dissuasion can be voting against them. If there are two pro-genocide parties and one loses because the vote was split with an anti-genocide party, sure, that may cause a slightly worse pro-genocide party to come to power, but it may also lead to the pro-genocide party that lost because of the split vote dropping the pro-genocide stance from its platform next election cycle, realizing it is a losing issue.

    I do not believe we will ever get radical change from electoralism but I do think you can get minor changes. The people I mostly have a problem with are those who say you should pledge your loyalty to the “lesser evil” party every election cycle because that just guarantees we will get the pro-genocide party every election cycle. To actually break free of that and have some electoral change you have to be willing to lose a few elections or else the cycle will last forever.

    It’s not like backing the pro-genocide candidate even helps you win all the elections, either. The “vote blue no matter who” crowd still constantly lose elections, and if you look at the numbers, the election losses are almost never because someone split the vote. They insist on backing such bad candidates that they lose anyway, and then they don’t even have leverage to push those candidates to the left because they all voted for them.

    Again, I don’t believe we will have some sort of radical fundamental change from electoralism, but you can get minor improvements. We see in western European countries that you can indeed achieve better social services and such through electoralism. But you have to be willing to vote for that, and even the American “progressives” don’t want to ever vote for that.



  • Well, I’m not sure why I’d even be running for a nomination to your “Racism Party™”, but I would be pretty unsurprised when I didn’t win.

    You’re the one advocating to run for a genocidal far-right jingoist party.

    I don’t understand why you’d have me running in that party in the first place so I don’t know what answer you’re fishing for here.

    You’re intentionally avoiding the point because you know I am right at this point.

    Now you’re just straight up strawmanning.

    You: “vote blue no matter who.”

    Me: “You’re saying we should vote blue no matter who.”

    You: “STRAW MAN STRAW MAN”

    When I read your first post here, I saw your line of thought was pretty thin, but there might be something of substance there. I can see what I thought was substance in your post was a mirage. It was a mistake to waste my time engaging with you.

    This is just copium. You have conceded my entire argument. You cannot uphold the position that we should mindlessly “vote blue no matter who,” so you intentionally avoid the point because you know mindlessly voting for genocidal fascists is not a tenable position.

    There is no point of discussing further as you have already conceded my argument but have too big of an ego to admit it.




  • Much of the voting population still thinks the Democrats are the “good guys” who will save us. Even here on Lemmy, speaking ill of the Democrats often gets me downvoted. The portion of Americans who are actually anti-capitalist is pretty small. Even most of the supposed “far leftists” just want to tax billionaires. The anti-capitalist movement in the USA is far too small to be influential; the only real organization is the DSA, and even then the DSA is composed of a mixture of socialists and liberals, so it is not a purely socialist organization.