It depends on what your personal definition of "quantum computer" is I guess - but these aren't simulations.
There have been "real" multi-qubit systems since at least 2009, when Yale ran some very basic algorithms on a 2-qubit system (http://www.nature.com/nature/journal/v460/n7252/pdf/nature08121.pdf).
In 2011 there was a 4-qubit system that factored numbers using Shor's algorithm (http://arxiv.org/pdf/1111.3726v1.pdf).
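Worth noting that the quantum hardware in Shor's algorithm only does one job: finding the period r of f(x) = a^x mod N. Turning that period into factors is ordinary classical number theory. A minimal C++ sketch of that classical step (function names are mine, not from the paper):

```cpp
#include <cstdint>
#include <numeric>   // std::gcd (C++17)

// Classical post-processing of Shor's algorithm. Given the period r of
// f(x) = a^x mod N (the part a quantum computer finds), a nontrivial
// factor of N is gcd(a^(r/2) - 1, N) or gcd(a^(r/2) + 1, N), provided
// r is even and we're not in a degenerate case.
std::int64_t ipow(std::int64_t base, unsigned exp) {
    std::int64_t result = 1;
    while (exp--) result *= base;   // fine for the small numbers here
    return result;
}

std::int64_t shor_classical_step(std::int64_t n, std::int64_t a, unsigned r) {
    if (r % 2 != 0) return 0;                 // odd period: retry with a new 'a'
    std::int64_t half = ipow(a, r / 2);
    std::int64_t f = std::gcd(half - 1, n);
    if (f > 1 && f < n) return f;
    f = std::gcd(half + 1, n);
    return (f > 1 && f < n) ? f : 0;          // 0 means degenerate, retry
}
```

For N = 15 with a = 7 the period is 4 (7^4 = 2401 ≡ 1 mod 15), and shor_classical_step(15, 7, 4) hands back the factor 3. All the quantum fuss is for finding that little r.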
I've not heard of a 5-qubit system, but it's not a field I follow, so I can believe one exists - and I don't have the physics to follow the is-it/isn't-it debate over D-Wave's stuff.
As for the different player races: you've got the Zoku, intelligent smart dust linked together by quantum cryptography, who take human form and remind me strongly of elves for some reason (perhaps because of their MMORPG culture). There's the Oubliette, people who were uploaded into computers, loaded into robot slave bodies to sort-of terraform Mars, and later given life in more-or-less human bodies, in a system where the currency is lifespan units. And there's the Sobornost, S=1 (or possibly S=2) transapients straight out of Orion's Arm (http://www.orionsarm.com) running around playing gods. And a few others. There's actually a Wikipedia glossary of terms to sort out the players and the played.
Read something yesterday. Looks like they're working on it in Australia. Something called a boson sampling computer, kind of a hybrid.
http://news.yahoo.com/computer-bridges-classical-quantum-computing-175759146.html
Speculating, I think the Zoku are quantum entities - 'quantum filth' - and necessarily use quantum encryption, quantum entanglement and quantum computing. All things quantum, really. And as I don't really understand quantum stuff, it all does sound a bit magical.
The Sobornost aren't trans-sapient, I think. Their attempts to achieve trans-sapience led to disaster: the Dragons. They can speed up and hand off tasks to instances and gogols, but they aren't trans-sapient.
In my opinion.
Actually, an entangled NetHack where you could superimpose Marvin on your own playing style, and derive a game where you ascended - with highlights of the game - without needing to spend time kicking beholders over fountains... mmm, that would be a useful contribution for those crazy quantum mechanists. Better than random teleportation, anyway...
[the Marvin who ascends 50% of the time he plays, not the Marvin who waits for all eternity for his two-headed friend]
(Flashback alert) Way back in the day, it wasn't Space Invaders, it was Microsoft Flight Simulator. And it wasn't a test of programming ability, it was a test of whether a particular PC clone was fully compatible... I'll just mention 640K once or twice now...
Strangely, specialised sums (even in radar signal processors) are often the easy bit. Granted, it can be fun tracking truncation errors and avoiding overflows when doing fixed-point arithmetic without benefit of standard libraries (or operating systems), but it's not that hard, IMHO.
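To make the fixed-point fun concrete: here's the classic Q15 multiply with round-to-nearest and saturation, the sort of thing you end up hand-rolling on a signal processor with no standard library. A sketch of the general idiom, not any particular processor's intrinsic:

```cpp
#include <cstdint>

// Q15 fixed-point: an int16_t value n represents n / 32768.0, so 0x4000
// is 0.5. Multiplying two Q15 values gives a Q30 product; we round, shift
// back to Q15, then saturate so that e.g. (-1.0 * -1.0) clips to +0.99997
// instead of wrapping around to -1.0 - exactly the overflow the text means.
using q15_t = std::int16_t;

q15_t q15_mul(q15_t a, q15_t b) {
    std::int32_t prod = std::int32_t{a} * std::int32_t{b}; // Q30 intermediate
    prod += std::int32_t{1} << 14;      // round to nearest
    prod >>= 15;                        // Q30 -> Q15 (arithmetic shift assumed)
    if (prod >  32767) prod =  32767;   // saturate high
    if (prod < -32768) prod = -32768;   // saturate low
    return static_cast<q15_t>(prod);
}
```

So q15_mul(16384, 16384) is 0.5 × 0.5 and comes back as 8192, i.e. 0.25. The truncation-error tracking the text mentions is deciding, for every such operation in the chain, where the rounding and the saturation go.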
What is hard is coming up with a model that allows you to map the real world to the abstract one you're writing. And then map back the answer - for every possible set of inputs... Every single time you say "but if this happens, then..." the number of paths through your system just doubled. If you forget about one particular combination of inputs, and your system doesn't cope, then you have a bug. If someone in marketing has a bright idea or a conveniently short memory, and declares "but we wanted it in yellow"...
The fun comes when "possible inputs" isn't just a static set of values defined at the start, it's when it varies with time, or it turns up in a different order, or there's some network latency, or a file that used to exist is now missing because the user decided (for whatever reason) to delete it or move it. You have to give meaningful error messages, you have to operate within a reasonable memory footprint and a reasonable response time, you shouldn't crash, and you should do all of this in such a way that you can prove it works to an acceptable level (testability) and that in a year's time you can still understand how you did it should it need to change (maintainability).
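The doubling above is easy to make concrete: each independent branch multiplies the number of distinct execution paths by two, so n branches give 2^n paths.

```cpp
#include <cstdint>

// Each independent if/else doubles the number of distinct execution paths:
// n independent branches give 2^n paths. Thirty of them is already over
// a billion combinations to reason about.
std::uint64_t path_count(unsigned independent_branches) {
    return std::uint64_t{1} << independent_branches;
}
```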
I've seen ...supposedly professional programmers... (thinks of the happy place) who thought that it was OK to just add clauses to a function until it was 600 lines of C++, containing 2^30 possible paths through the one function. Throw in multiple exit points, absent error handling, and some leaked memory, and watch me howl.
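On the multiple-exit-points-plus-leaked-memory combination: in C++ the idiomatic cure is RAII - tie the resource's lifetime to a stack object so that every exit path, return or throw, releases it automatically. A minimal sketch (the Buffer type and the limits are made up for illustration):

```cpp
#include <memory>
#include <stdexcept>
#include <string>

// Hypothetical resource, standing in for whatever the 600-line monster
// was allocating with raw 'new' and forgetting to 'delete'.
struct Buffer { std::string data; };

// With a raw pointer, every early return below would be a leak.
// With unique_ptr, the Buffer is freed on ANY exit path.
bool process(const std::string& input) {
    auto buf = std::make_unique<Buffer>();   // freed on any exit
    buf->data = input;
    if (buf->data.empty()) return false;     // no leak here...
    if (buf->data.size() > 1024)
        throw std::length_error("too big");  // ...or here...
    return true;                             // ...or here
}
```

It doesn't fix the 2^30 paths, but it does mean you can add an exit point without adding a leak.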
Not to worry. Flame wars have nothing on the holy wars that can erupt when someone in any organisation starts talking about creating a set of programming guidelines...
Incidentally, I have one piece of software where a single sub-program contains 2,000 (two thousand) lines of Ada, which is clearly far too many, but most of them are already logic structure or sub-program calls. I have tried to simplify it, but reached the conclusion that I'd be creating sub-programs rather than removing multiple copies of the same lines of code.
By "the singularity", I assume you actually mean "the next singularity", and that it will involve at least one form of AI, though whether you mean "self-aware" is unclear. Let's just hope it's not a Fredric Brown "Answer"-type answer to your question, then?
And, of course, it WILL be like powered flight or submarines - it'll fly, but it won't be, or look like, or behave like a bird: ... - it'll swim, but it won't be, or look like, or behave like a fish: ...