I’ve been working with some test harnesses for our Croquet worlds. It’s been a real pain working outside of Croquet: getting things to happen across multiple platforms. Moving data around. It’s all so much easier in a virtual space that automatically replicates everything.
Anyway, we finally got it working well enough that several machines in Qwaq’s Palo Alto office are all running around as robots in a virtual world, doing various user activities to see what breaks. Being (still!) in Wisconsin, I have to peek at these machines remotely. I’m currently using Virtual Network Computing (VNC), but there’s also Windows Remote Desktop (RDP). These programs basically scrape the screen at some level and send the pictures to me. So when these robots are buzzing around in-world, I get a screen repaint, and then another, and then another. And that’s just one machine. If I want to monitor what they’re all doing, I have to have a VNC window open for each, scraping and repainting away. Yuck. If only there were a better way….
Well, these robots are all in a dedicated “Organization” called “Tests”. All I have to do is sign in to Tests with the normal Qwaq Forums client directly from my own home machine. I take a bird’s-eye position in the sky (with one press of the “End” key) and can watch all the robots do their thing in real time. There’s even a panel that lists all the (robot) users in the org, tells me if they are in a different space in the Organization, and puts up a little exclamation mark if they are being unresponsive. And of course, the graphics are all smooth.
How cool is that?
This is a nice illustration of the difference between two approaches to distance collaboration:
VNC: I run a dumb client to see a low-frame-rate view of the whole desktop of each of the N individual remote machines. I have to manage N 2D windows, each showing exactly what that machine shows. I can’t change my observational perspective, nor participate in the test itself. There are N pairwise traffic connections. Each observed robot has to run additional software (a VNC server), and I get access to the whole machine on which it runs.
Virtual worlds: I run a smart client with one low-traffic connection that uses my graphics card to give a high interactive frame rate. I can independently choose my observational viewpoint and can interact with the robot participants. There’s nothing else needed on the robot machines, and I don’t get access to those machines – just access to the shared/common virtual world.
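The traffic difference above comes from what goes over the wire. Here’s a minimal sketch of the idea, with made-up class names (this is not VNC’s protocol or Croquet’s actual API): a screen scraper ships a full bitmap to every observer on every repaint, while a replicated world ships only the input events, and every client deterministically computes the identical state for itself.

```python
class ScreenScraper:
    """VNC-style: the machine renders, then sends whole frames to each observer."""

    def __init__(self):
        self.frames_sent = 0

    def send_frame(self, pixels, observers):
        # One full bitmap per observer, per repaint.
        self.frames_sent += len(observers)
        return [pixels for _ in observers]


class ReplicatedWorld:
    """Croquet-style: every replica applies the same events in the same
    order, so all copies of the world stay identical without shipping pixels."""

    def __init__(self):
        self.robots = {}

    def apply(self, event):
        # Deterministic update: same event stream -> same state everywhere.
        name, x, y = event
        self.robots[name] = (x, y)


# The only thing on the wire is this small event stream (hypothetical data).
events = [("robot1", 0, 0), ("robot2", 5, 5), ("robot1", 1, 0)]

mine, theirs = ReplicatedWorld(), ReplicatedWorld()
for ev in events:
    mine.apply(ev)      # my home machine
    theirs.apply(ev)    # a robot in Palo Alto

assert mine.robots == theirs.robots  # replicas agree, no pixels sent
```

Because each client holds the whole world, it can render from any viewpoint at whatever frame rate its own graphics card supports, which is exactly why the bird’s-eye view works.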
About Stearns
Howard Stearns works at High Fidelity, Inc., creating the metaverse.
Mr. Stearns has a quarter century experience in systems engineering, applications consulting, and management of advanced software technologies. He was the technical lead of University of Wisconsin's Croquet project, an ambitious project convened by computing pioneer Alan Kay to transform collaboration through 3D graphics and real-time, persistent shared spaces. The CAD integration products Mr. Stearns created for expert system pioneer ICAD set the market standard through IPO and acquisition by Oracle. The embedded systems he wrote helped transform the industrial diamond market. In the early 2000s, Mr. Stearns was named Technology Strategist for Curl, the only startup founded by WWW pioneer Tim Berners-Lee. An expert on programming languages and operating systems, Mr. Stearns created the Eclipse commercial Common Lisp programming implementation.
Mr. Stearns has two degrees from M.I.T., and has directed family businesses in early childhood education and publishing.
Wow.
This is almost, like, um, *computer science*.
Cool beans.
jrs
It was kind of cute when I showed this to our QA team. Because I was in Wisconsin and they were in California, I joined them in-world. We spoke by voice and video as I showed them the conventional Web-based UI for starting tests and examining the archived results. The Web browser was running on a wall of the in-world test center.
But when I pushed the button to start the tests, the robots started operating all around us. We were standing and talking _IN_ the test.
On reflection six months later…
There are a lot of things I build as a programmer that I don’t use as a user. The test environment is an example where I found it beneficial to be a user of a virtual world as a convenient way to experience what I had developed.
I wonder if virtual worlds have as a general property that they tend to push developers into being actual users.