
What do you think about what I think about quantum physics?

neRok - Senior Member - Joined: Oct 7, 2022 - Messages: 417 - Likes: 259 - Location: Australia
I occasionally watch videos regarding physics, but I try not to dwell on the possibilities too much (coz I've got other things to waste time on). However, I had what seems to be a decent thought the other day, and so perhaps some of you will enjoy making comment? (PS: this post is 0% ChatGPT)

So the other day I was watching the video "Will scientists ever agree on quantum? | Sabine Hossenfelder and Matt O'Dowd FULL TALK", and during the discussion about entanglement and "spooky action at a distance" and all that, I think it was O'Dowd who said something like "it's kind of like the particle knows the outcome of its measurement at the time of its creation, which means it knows what the future will be"...

And at that point, I had the thought that it's not the "outcome" that needs to be known at the moment of creation, but (in computing terms) actually just the "get_outcome_function" that needs to be passed along with the particle, and then the "outcome" (result) can be computed "lazily" if/when it is needed using the "local state" as a variable.

But if that were the case, then the 2 entangled particles could end up with the same computed outcome, which would be "wrong", and hence the "spooky action" comes into the mix. But sticking with the lazy_function train of thought, it's trivial for each function/particle to have a reference/pointer to the other, so that at the time it needs to get its outcome, it can first check if the other particle has already calculated its outcome, and return the opposite if so.
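
Since I think in code, here's a minimal Python sketch of what I mean - purely illustrative, all the names (Particle, get_outcome, create_entangled_pair) are made up, and it's obviously a classical toy rather than real physics:

```python
import random

class Particle:
    """One half of a toy "entangled" pair."""

    def __init__(self):
        self.partner = None   # reference/pointer to the other particle
        self.outcome = None   # not computed yet ("lazy")

    def get_outcome(self):
        # Lazy evaluation: only compute when a measurement actually happens.
        if self.outcome is None:
            if self.partner is not None and self.partner.outcome is not None:
                # Partner was already "measured", so return the opposite result.
                self.outcome = -self.partner.outcome
            else:
                # Nobody has been measured yet, so compute from "local state"
                # (here just a coin flip, for the sake of the sketch).
                self.outcome = random.choice([+1, -1])
        return self.outcome

def create_entangled_pair():
    a, b = Particle(), Particle()
    a.partner, b.partner = b, a
    return a, b

a, b = create_entangled_pair()
print(a.get_outcome(), b.get_outcome())  # always opposite: (1, -1) or (-1, 1)
```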

And so there's still the problem regarding the "speed of light" and how that information gets transferred, but if you consider at this point the "simulated universe" theory, then that's a trivial problem; because the information doesn't have to travel particle-to-particle "through" the simulation, but it can simply be transmitted through the "server". For an analogy, consider a computer game that runs at 60fps (60Hz), but on a computer operating at ~4GHz. Thus within 1 "tick" of the simulation, there is plenty of time to transfer/compute that information (and in the "universe simulation", 1 tick would be 1 unit of light-speed-time).
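
Back-of-envelope: 4 GHz / 60 Hz ≈ 67 million clock cycles per rendered frame, so the "server" would have tens of millions of internal steps available inside every single tick that anyone "inside" the game could possibly observe.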

Now I normally don't bother considering the "simulation" theory much, because it's kind of irrelevant in the same way as considering whether the universe was created by God or similar. But in regards to this quantum discussion, it seems to "fit nicely". That's because there is also the aspect of quantum physics regarding "many possible outcomes" and "wave functions" etc. And so if you ponder why all these "possibilities" would be "computed" and many discarded, then there are 2 rather simple answers why: 1) It might just be "branch prediction" by the CPU running the simulation; or 2) the "simulator" might not be computationally constrained, but I/O constrained.

So continuing with #2, and in regards to "inefficient" code: if particle_1 needed to get its result, it would have to check if particle_2 had a result, and if not, then calculate and return a result. But what if that "request" between particles was "slow", and it just so happens that you have tons of computing power to spare? Then it would make sense for particle_1 to send off the request to particle_2, and in the meantime just go ahead and calculate all possible outcomes. That way, once particle_2 has responded, particle_1 is "ready to go". In programming terms, this is basic sync/blocking vs async style programming, and it's exactly what is happening in your web browser right now.
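
Again, a rough Python sketch just to show the shape of the sync-vs-async point (asyncio here, and every name is a stand-in):

```python
import asyncio

async def ask_partner_for_result():
    # Stand-in for a slow "request" to the other particle (I/O-bound, not CPU-bound).
    await asyncio.sleep(1.0)
    return None  # partner hasn't measured yet

def precompute_all_outcomes():
    # Cheap CPU work we can do while the slow request is still in flight.
    return {+1: "spin up", -1: "spin down"}

async def measure():
    request = asyncio.create_task(ask_partner_for_result())  # fire off the request...
    table = precompute_all_outcomes()                        # ...and keep the CPU busy meanwhile
    partner_outcome = await request                          # only now do we wait/block
    if partner_outcome is not None:
        return table[-partner_outcome]  # partner answered first: take the opposite
    return table[+1]                    # nobody measured first: pick locally

print(asyncio.run(measure()))
```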

But why would there be an "I/O constraint"? Well, when you consider our computers, there is always a "penalty" to transfer data between 2 different places, like CPU cache to RAM, or RAM to HDD, or from the "cloud" versus a thumb-drive, etc. This implies that there is computation happening at different "places"...

Now I'm "forking" my own post away from quantum towards a previous "thought" I've had involving "dark matter" — but I'm just going to give the jist: If each galaxy is it's own "simulation", then they may be subordinate to a master/universe simulator. If this is the case, and one universe needed "some" information from another (ie, to render some stars), then you wouldn't want to transfer the "full state" of 1 universe to another, and thus you would send something "simpler" (like what voxels are, and like how game engines take "shortcuts" regarding "global illumination"/lighting). And to jist-the-jist: if we measure universes and they behave like they have more mass than it appears they do (ie, have dark matter), then what if that is simply the difference between `count_object_memory_use(universe_2.screenshot)` and `universe_2.memory_in_use`, and the reported memory isn't "accurate" because it is the memory in use by all objects of the simulation as opposed to objects IN the simulation (i'm implying that it might include "ghost objects", ie objects that have been deleted but not yet "garbage collected").

And my fork doesn't stop there, because black holes can be explained similarly. Something about them that stood out to me is the "fact" that the event horizon "stores" the "data" of everything that passes it. In short, perhaps: 1) black holes are the "recycle bin", whereby the objects are deleted, but still actually there; OR 2) like when you actually delete a file on your computer, it's probably not "physically" deleted from the hard drive, but its space is just labelled as "free". Thus, the data is still technically there, but not taking up any space...

And now to fork the discussion way back: perhaps it doesn't have to be a "simulation" for the data to transfer at another "level", because can't it just happen in a higher dimension?
Edit: And what if the "deleted" data still exists in that higher dimension (like bits still physically recorded on a hard drive), and thus has "some effect" (ie, it is the dark matter, and how it causes effects is akin to "buffer overflows").

Anyway, please enjoy my ramblings.
 
While this is an audio forum, I do find posts on a variety of topics interesting - at least some of them. I don't click through to every post, as the title usually tells me at least what to expect.

I do find technology fascinating, and quantum computing is for sure at the door. While I could (somehow) understand the fundamentals of AI tech, I admit that I am out of my depth with anything quantum related, so I will need to spend some time getting there.

I always thought that we would extinguish ourselves with stupid laws and rules, but now I see that technology is the prevailing factor - not to say that we won't find a wrong way to implement it.
 
+1
 
Maybe this site could have a tab titled "Off Topic Discussions", and non-audio content could be posted there. I come here with the intent to read & discuss audio topics, not quantum mechanics, which I am too ignorant to understand given its vast complexity and weirdness.
 
Actually, I try not to think about physics of any kind, but particularly Quantum Physics. :)

Time to feed the chickens!

Regards
 
I am fine with the way it is, or with the way you suggest.

I thought I would soon be enjoying an easy retirement, but things just keep popping up. Quantum computing is at our door. Ignoring it is probably the worst option. It will for sure change our lives once everybody is done with the AI frenzy. Then there will be a merging of AI and Quantum, so who will be able to figure out what is going on?
 

I agree this is a fascinating topic.

As to your thoughts on it, I think your computing analogy is an interesting and useful tool to consider the questions and challenges posed by quantum physics.

With that said, your analogy illustrates something that Sabine Hossenfelder routinely says as a critique of modern theoretical physics - it’s not doing science, because it’s not proposing testable hypotheses but rather imagining various, similarly possible universes that could explain (or explain away) the mysteries and dilemmas created by our testable scientific observations. Sabine’s videos rate such speculations (when these speculations claim to be robust scientific hypotheses) on a “Bulls**t Meter.” Other physicists more diplomatically - and I’d say more accurately - call these speculations philosophy, as distinct from science.

So I think your ideas are interesting - and fun! - to consider. But like too many theoretical-physics books and scientific papers these days, such ideas tell us nothing about the foundations of the physical world we live in.
 
I certainly wouldn't ignore it, but I would not attempt to understand it. I would be open to using functions that it enables & presents to users in an approachable fashion.
 
Then let's hope that someone comes up with a manual for those of us less inclined to tech. I would certainly love that.

I do have some really bad scenarios running in my head, but no need to scare the crowd (much) at this point. The backbone of the argument is that quantum physics is really complex, so not many people can understand it. It requires a really high IQ which, I am afraid, is higher than what I have - but I will give it my best shot.

Then, the whole concept will likely be implemented in a Master-Servant framework which will allow the Masters to control the whole scheme, even without necessarily understanding completely what it is or what it does. The Masters will obviously want to profit from the scheme. Think Wall Street and Canary Wharf.
 
Please express all of this in mathematical equations and then we’ll talk.
 
So, could we record music with some type of quantum physics, so that when it's played back it would know what the room is like and adjust?
Every semiconductor already relies on quantum mechanics to work, so yes, we can, we do ;)
 
A friend of mine pointed me to Sabine's videos. I find her positions interesting and sometimes compelling. Unfortunately, some of her assertions take more knowledge of physics and its research field than I have to judge in any substantial way.

As someone who worked in various aspects of systems and computer engineering in my career, I feel compelled to understand quantum computing, and that leads me to try to understand quantum physics. Not being a physicist, I find it sometimes difficult to understand the fundamentals of what I'm reading.

Quantum computing doesn't work without entanglement, and we don't seem to have settled science in the physics community for how entanglement works. I was at a dinner some time back with a small group of working physicists. I thought that was a great time to get some of my questions answered. One topic led to another, and finally I brought up the issue of entanglement, of course mentioning that quantum computers don't function without entanglement, yet I had not seen a working theory for how entanglement works. Is there one? I got that look from all four of them that one usually sees when a person brings up a completely inappropriate topic in polite company. Like politics or religion. After the moment of silence I asked, "Since we know entanglement exists, because quantum computers, however simple, won't function without entanglement, and other experiments have proven faster than light reaction speeds, doesn't entanglement break modern theories of physics?" The reactions I got were priceless. Mostly rolling eyes, and looks between them like I said something really stupid or crossed some boundaries.

One physicist finally responded. "Relativistic physics theory and quantum physics theory are both incomplete." I could tell no further questions would be appropriate. :) I felt like I stepped in something smelly and squishy.

Later on I learned there was a popular theory floating around called ER = EPR, which essentially means that entangled particles "communicate" via wormholes. Recently a friend who works for a quantum computing company (in software, not hardware) sent me this link. The paper the link refers to can be downloaded as a PDF for free. It is thought provoking, at least if you're like me and not an expert.

 

Incompleteness is a good assumption with any scientific theory. Room for improvement for every theory is built into science.
 
Yeah, I know, but I was at least looking for an interesting discussion, not eye rolls. :) I just wanted to learn.
 
Sir, this is an audio forum.
[screenshot attachment: the forum index, with one forum section circled]

And just under that circled Forum area... I quote
Don't be an audio nerd. Show that you are into something other than fetal position in a dark room in front of a couple of speakers.

Peace
 
While you're all here nerding, let me give you a good investment idea:
[image attachment]

It's excellent.
 

Quantum entanglement is a fairly well-known and well-understood phenomenon. You may consult, for instance, Wikipedia for more information: https://en.wikipedia.org/wiki/Quantum_entanglement.
 