Comments on Sentient Developments: Jenkins on the ethics of historical simulations

Dirk Bruere (May 15, 2009):

When I graduate to PostHuman status, one of the first things I want to do is simulate possible versions of my life, filling in the "what if" scenarios. Anyway, that's my reality taken care of. Not quite sure what the rest of you are doing here...

Anonymous (September 22, 2006):

Like many of those who have posted comments, I agree that we may well be living in a simulation running on some supercomputer in "a higher level of reality". But I don't think we have enough information to assign any probability to this possibility, and I don't agree with the conclusion that the simulation would probably be terminated as soon as its conscious inhabitants develop the capability to run their own equivalent simulations of "lower levels of reality". Wouldn't that make the original simulation more interesting? Creating an endless cascade of realities may even be the *objective* of the original simulation.

Anonymous (September 20, 2006):

George, Michael, and especially Robert - thanks for your insightful comments.
I have posted a summary and reply on my <A HREF="http://petabytes.typepad.com/blog/2006/09/historical_simu.html" REL="nofollow">blog</A>. Looking forward to seeing you at the Riv this weekend, George!

George (September 19, 2006):

Robert Bradbury responds:

On 9/18/06, George Dvorsky wrote, quoting Michael Anissimov:

"Why would our Simulation Overlords terminate our simulation just because we make another one? It doesn't cost more computing power."

It depends entirely on what the purpose of the simulation is. Simulating a human brain on current-technology computers is significantly less efficient, in terms of matter and energy resources, than running the brain on the actual hardware of this universe (atoms and molecules) [1]. If we had an "at the limits of physics" brain design [2], simulating such a brain on anything less than the physical instantiation itself is going to be significantly slower than actually building and running that brain.

If this reality is a simulation meant to study the evolutionary paths of advanced civilizations up to the point where they begin to simulate the evolutionary paths of advanced civilizations, then we may soon (within decades) be "suspended". If this reality is a simulation meant to explore the feasibility of designing computers and running simulations based on femtotechnology, then we may have a much longer future ahead of us [3].

"From the physics perspective, a box that is just a box and a box with a mini-universe obey the same laws and require the same amount of computation."

"Same amount of computation" is the part that is inaccurate. One of the first "cool" tasks I had as a programmer (back when I was somewhat younger than Michael) was to write a computer simulator [4].
The only way simulations run nearly as efficiently as what is being simulated is if the underlying hardware architecture is explicitly designed for simulations. It currently appears that the hardware architecture of this universe is not designed to easily enable simulations [5].

"Robert Bradbury is wrong; a simulation would not necessarily run slower than the real world... the speed of a world is determined by the processing of the minds in it, not anything inherent about the world itself."

I love being wrong, but you have to prove it to me... I think what Michael is wrestling with is whether or not the simulation manages to develop abstractions (shortcuts) that enable computational speedups exceeding the slowdown inherent in the simulation. This is not unreasonable. For example, we use computers as a shortcut for doing repetitious arithmetic because the natural brain does it so slowly. We develop laws of physics that allow us to compute results directly rather than having to simulate them. The question then becomes whether the abstractions or shortcuts developed in the simulation can be translated back into the reality that produced the simulation. Or are the rules there so different that extracting the inventions of a simulation is impossible? [6,7]

"We can build a simulation in nanocomputers composed of beings that think a million times faster than us; therefore the 'world' can be said to be moving a million times as fast. That's not slower, now is it?"

Ah, this is where things become confused. One can build computers using limits-of-physics hardware in *this* reality. They will run faster [8]. But simulating a limits-of-physics computational engine (i.e., simulating the nanocomputer on current computers, or even simulating a nanocomputer on a nanocomputer) is unlikely to run faster or be more efficient than using the "real" thing.
As mentioned previously, only if the hardware is intentionally engineered to facilitate simulations will it run at close to non-simulated speeds or with non-simulated efficiency [9].

Robert

1. This gets into Seth Lloyd's perspective that one can think of "this" universe as simply one very large quantum computer (in that the "instruction set" is the equations of physics, especially quantum mechanics).
2. For many, perhaps most, types of computations this is exactly what a Matrioshka Brain is at solar-system scales, unless engineering femtotechnology computers is feasible.
3. One can build a Matrioshka Brain on very short time scales once one has the base level of nanoengineering skills -- but optimization of MBrains or MBrain collectives (KT-II to KT-III civilizations), which is what may be required to explore "femtoreality", can take millions to billions of years (unless we find ways to rewrite the hardware rules for this universe).
4. We simulated a 36-bit PDP-10 mainframe on a 16-bit PDP-11, so a single "add" instruction on the PDP-10 required 6 instructions on the PDP-11. The overhead of packing and unpacking 36-bit instructions and data pushed this up to at least 10x slower, and the larger virtual memory system of the PDP-10 added probably another order of magnitude reduction in speed. So a compilation that took minutes on the PDP-10 would take hours on the PDP-11. But obtaining PDP-10 time was difficult and/or expensive while PDP-11 time was "free", so the exercise was justified.
5. We could probably digress into a long debate about whether quantum computers will serve this precise function. But current molecular dynamics simulations require hours of supercomputer time to get nanoseconds of "real" time. So, at least currently, the hardware architecture of this universe does not appear to be easily simulated.
6. One example that comes to mind is human languages.
Though rules for grammar may be built into all humans, there are aspects of some languages so specific to their culture (in fact, the language may "dictate" that cultural reality) that they cannot be translated into different cultures.
7. This raises the interesting question of whether "gods" go to the trouble of only running simulations (creating realities) that explicitly allow easy abstraction extraction, or whether one intentionally designs reality phase spaces from which extraction is difficult or impossible.
8. One has to be very careful about this. Nanosystems, pg. 370: "A more modest 10 W system can deliver ~10^11 MIPS." The brain is ~10 W and ~10^15 OPS, so on a power-per-op basis the nanocomputer has only about 100x the processing capacity. But the nanocomputer is significantly smaller than the brain, so you get a significant speedup due to reduced communication delays. One can of course get a 10^21 OPS nanocomputer (a million-fold speedup) in a much smaller volume than the brain, but it requires 100,000 W (as well as a radiator significantly larger than the nanocomputer[!]).
9. The nanocomputer one commonly discusses is Drexler's rod-logic nanocomputer. That is a general-purpose "mechanical" computer which would presumably execute some kind of general-purpose instruction set. To the best of my knowledge, nobody has designed computer hardware optimized for "reality" simulations (one wants to execute the fundamental laws of physics as efficiently as possible). Current computer graphics chips and the IBM Cell processor are closer to what is required than general-purpose microprocessors (because they are optimized to deal with aspects of physical reality). Some companies are starting to produce chips optimized for the laws of physics, or gate arrays that can be reprogrammed for these purposes -- but to the best of my knowledge nobody has tried to design a limits-of-physics "nanocomputer" optimized for this task.
Perhaps because atoms and small molecules already satisfy this need.