David Smith made this video a year ago, showing how you could have:
virtual world objects automatically populated by real world objects;
scripted behavior for those interactive objects that:
gives realtime display of real world data associated with those objects;
allows you to control the associated real world objects (like Swayze in “Ghost”);
all while functioning in a standard virtual world in which the participants can communicate with voice/video/text/gesture and spontaneously share apps, etc.
I don’t know why I failed to post this when it first came out. I think maybe I wanted to see how it would play out. Everything shown was written in the widely used Python scripting language, in a way that end-user/programmers can add to the system themselves, rather than being built into the system by the original developers. Would anyone actually do that? Would anyone use in-world computer screens to interact with external real-world programs?
Well, the panels and the programming interface have had a year to mature, and we now have multiple government agencies and multiple big oil companies using it for their own operation centers. I’ve never seen most of it and can’t show it, so this remains the only video I can show of an operation center. There are some nice descriptions of portions of the Navy’s sub training environment, but no video. The material that is written and public can give you a feel for what isn’t.
Howard Stearns works at High Fidelity, Inc., creating the metaverse.
Mr. Stearns has a quarter century of experience in systems engineering, applications consulting, and management of advanced software technologies. He was the technical lead of the University of Wisconsin's Croquet project, an ambitious project convened by computing pioneer Alan Kay to transform collaboration through 3D graphics and real-time, persistent shared spaces. The CAD integration products Mr. Stearns created for expert system pioneer ICAD set the market standard through IPO and acquisition by Oracle. The embedded systems he wrote helped transform the industrial diamond market. In the early 2000s, Mr. Stearns was named Technology Strategist for Curl, the only startup founded by WWW pioneer Tim Berners-Lee. An expert on programming languages and operating systems, Mr. Stearns created the Eclipse commercial Common Lisp programming implementation.
Mr. Stearns has two degrees from M.I.T., and has directed family businesses in early childhood education and publishing.
Cool. I submitted this to /.
It turns out the Air Force is disclosing some stuff, too: http://www.keesler.af.mil/n…
Awww, Johnny, I can see your mind turning already…. The Hunt for Red Leader 5. Do Avatars Dream of Electric Murders? ….
It happens that I work in a real operations center and can go touch the real machines if I so desire, but rarely do. Our operators rarely touch the real machines either. Mostly they watch status monitoring displays and consoles for various machines and applications. A few years ago, as an experiment, I tried to build a virtual operations center with these various displays in Croquet, but it didn’t go very far. I can see how far beyond Croquet you have gotten, and it’s pretty cool. The use case videos you have posted lately are very interesting – thanks for sharing!
And these have all been kind of old, too. It was time to clear out some old content lying around, since we were about to change our name.
The Python integration has a nice model that I’ll have to write up. It’s sort of a third way that is yet again different from both the Croquet replication model and the media model, but uses both. I guess the short version is that there is an off-island resource, which is the program, and it gets downloaded to one machine to run. That allows the use of standard Python libraries without worrying about getting bit-identical results, because only one computer is running the program. This also allows that one computer to communicate (through Python) with the real world at large. The resulting data and events then get replicated to the other participants.
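To make that concrete, here is a minimal sketch of the pattern as I understand it, with all names hypothetical (this is not the actual API): one host machine runs the external program, which may use any non-deterministic real-world libraries, and only the serialized results are broadcast to every participant's replica.

```python
import json

class IslandReplica:
    """A participant's copy of the shared state. Replicas never run the
    external program themselves; they only apply replicated events."""
    def __init__(self):
        self.state = {}

    def apply_event(self, event_json):
        # Events arrive as serialized messages, so every replica sees
        # identical input and all copies of the state stay in sync.
        event = json.loads(event_json)
        self.state[event["key"]] = event["value"]


class HostRunner:
    """Stands in for the single machine that executes the off-island
    program. Because only one computer runs it, ordinary Python
    libraries can be used without needing bit-identical execution
    everywhere."""
    def __init__(self, replicas):
        self.replicas = replicas

    def run_external_program(self, program):
        # 'program' is arbitrary Python that could talk to the real
        # world (sensors, HTTP, databases). Its *results* are what get
        # replicated, not its execution.
        for key, value in program():
            message = json.dumps({"key": key, "value": value})
            for replica in self.replicas:
                replica.apply_event(message)


# Hypothetical external program: pretend we polled a real-world sensor.
def sensor_poll():
    yield ("pump_7/pressure", 42.5)
    yield ("pump_7/status", "nominal")

replicas = [IslandReplica() for _ in range(3)]
HostRunner(replicas).run_external_program(sensor_poll)
assert all(r.state == replicas[0].state for r in replicas)
```

The design point the sketch tries to capture is that determinism is only needed on the replication side (applying serialized events), not on the program side, which is what frees the one runner to use the full standard library.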