Report on the Dorkbot Meeting of 6 July

July 7, 2008 at 9:23 pm

The 6th Edition of Dorkbot Second Life was very very interesting!
JJ Ventrella presented Avatar Puppeteering, and Philippe Bossut told us all about the HandsFree3D project, and his work on connecting this 3D camera tracking to the puppeteering feature.
Read the Meeting Announcement for more information and bios of the presenters.
Over 70 people showed up, so obviously there is a lot of interest in these technologies.
A number of people wanted to know how they could try it, and get involved in helping these technologies to fruition. Therefore I offered to make a puppeteering client available, and organize a workshop in the near future.
To this purpose I created the group “Avatar Puppeteers”. It’s open to anybody who wishes to contribute to further development and usage of Avatar Puppeteering.
Though things were kept simple, using only video and text chat, the meeting was probably the most chaotic one we ever had! Anticipating the huge interest in this topic (who doesn’t want more expressive avatars and more fluid ways to interact?) I had called in the help of a skillful colleague organizer, but unfortunately his laptop broke down shortly before the meeting! Oh well, I have now learned that you need 4 people to keep an event like this under control. It was very nice to see Sugar Seville’s seats filled with people, and that Odyssey, home of the dorkbot place, can handle this 🙂
A lot was learned, connections were made, and groups were joined. Hurray!

So here comes the chat transcript (heavily edited for readability). And some snapshots by Fau Ferdinand and Bea Fontani.

[13:04] Evo Szuyuan: Welcome to the 6th Dorkbot Second Life meeting!
[13:04] Evo Szuyuan: Dorkbot refers to a group of affiliated organizations
[13:04] Evo Szuyuan: worldwide that hold informal meetings of artists, engineers, and designers
[13:04] Evo Szuyuan: working in the medium of electronic art.
[13:05] Evo Szuyuan: Its motto is ‘people doing strange things with electricity’. For Dorkbot Second Life, this theme is slightly modified to ‘people doing strange things with Second Life’.
[13:05] Evo Szuyuan: We want to explore the new possibilities of the metaverse and try to establish a platform for a glocal way of creating new connections and collaborations.
[13:05] Evo Szuyuan: If you’d like to be informed on future meetings and news, join the ‘dorkbot SL electronic art meetings’ group.
[13:07] Evo Szuyuan: I’m very happy to introduce you now to the main developers of 2 very exciting projects.
[13:07] Evo Szuyuan: Ventrella Linden (JJ Ventrella in RL), who you may also know as the man behind the virtual world There, flexi prims and follow cam (and much more) will present Avatar Puppeteering.
[13:07] Evo Szuyuan: Roger Fullstop (Philippe Bossut in RL), who works as an β€œEntrepreneur in Residence” at Kapor Enterprises Inc (KEI) is going to tell us about Segalen/HandsFree3D.
[13:08] Evo Szuyuan: (please go to for their complete bio and many interesting links of what they have been up to).
[13:08] Evo Szuyuan: JJ and Roger will both do a 20-minute presentation.
[13:08] Evo Szuyuan: After each presentation we have about 5 minutes time for questions.
[13:08] Evo Szuyuan: After both presentations we will have a general discussion
[13:09] Evo Szuyuan: so I will now give the word to Ventrella Linden!

[13:09] Ventrella Linden: Thanks Evo!
[13:09] Ventrella Linden: Hi
[13:09] Ventrella Linden: I worked at Linden Lab for 2 years.
[13:09] Ventrella Linden: I’m passionate about creating intuitive motion.
[13:10] Ventrella Linden: While I was at Linden Lab, I did 3 main things:
[13:10] Ventrella Linden: FollowCam,
[13:10] Ventrella Linden: Flexies…
[13:10] Ventrella Linden: and Puppeteering
[13:10] Ventrella Linden: More info about puppeteering at
[13:10] Ventrella Linden: Let’s see the first movie!

Pose Movie

[13:11] Ventrella Linden: Here’s an example of manipulating a physical avatar.
[13:11] Ventrella Linden: ok..

[13:12] Ventrella Linden: Normal avatar animation uses a local coordinate system.
[13:12] Ventrella Linden: I call it the “lonely coordinate system”.
[13:12] Ventrella Linden: The goal of Puppeteering is to put avatars and users into a common space.
[13:12] Ventrella Linden: Avatar-to-avatar, and user-to-avatar manipulation is the ultimate goal.

[13:13] Ventrella Linden: The first two goals of physical avatar were:
[13:13] Ventrella Linden: (1) real-time expression
[13:13] Ventrella Linden: (2) free-form posing to make animations
[13:13] Ventrella Linden: next image…
[13:14] Ventrella Linden: How many times have you wanted to just nudge your avatar to a specific pose?

[13:14] Ventrella Linden: The physical avatar takes the normal avatar skeleton and adds a few things.
[13:14] Ventrella Linden: For instance, it adds 5 “end effectors”.

[13:14] Ventrella Linden: It also uses spring physics for the “bones”.
[13:15] Ventrella Linden: The springs hold the joints together.
[13:15] Ventrella Linden: (opposite of nature)
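As an aside for the programmers reading along: the spring idea Ventrella describes can be sketched in a few lines of Python. This is just my own toy illustration of a distance constraint between two joints, with made-up names and numbers, not the actual viewer code.

```python
# Toy sketch (not viewer code): a spring that holds two 2D joints at a
# fixed "bone" length by nudging both ends toward the rest distance.

def spring_step(p1, p2, rest_len, stiffness=0.5):
    """One relaxation step; assumes the joints are not coincident."""
    dx = p2[0] - p1[0]
    dy = p2[1] - p1[1]
    dist = (dx * dx + dy * dy) ** 0.5
    # Fraction of the length error to remove, split between both ends.
    correction = stiffness * (dist - rest_len) / dist / 2.0
    return ((p1[0] + dx * correction, p1[1] + dy * correction),
            (p2[0] - dx * correction, p2[1] - dy * correction))

# Example: two joints placed too far apart relax toward rest length 1.0.
a, b = (0.0, 0.0), (2.0, 0.0)
for _ in range(20):
    a, b = spring_step(a, b, rest_len=1.0)
```

Iterating a constraint like this over every bone (plus the extra chest and pelvis springs mentioned below) is what keeps the ragdoll in one piece.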

ragdoll movie

[13:15] Ventrella Linden: This movie shows “rag doll mode”.
[13:15] Ventrella Linden: That’s basically physical avatar with gravity turned ON.
[13:16] Ventrella Linden: BTW, to see a cute example of ragdoll physics…
[13:16] Ventrella Linden: Google “Ventrella”, and “Peanut”.
[13:16] Ventrella Linden: OK – done with plug…

[13:16] Ventrella Linden: A few extra springs are added to hold the chest and pelvis regions together.
[13:17] Ventrella Linden: These form tetrahedra – one of Bucky Fuller’s favorite shapes!
[13:17] Ventrella Linden: Tensegrity – one cool concept.
[13:17] Man Michinaga: for sure, also good work on it in teammanagement
[13:17] Ventrella Linden: Evo’s friend deJong knows this.
[13:17] Ventrella Linden: Roll the next movie!

tets video

[13:17] Ventrella Linden: This movie shows the internal skeleton springs – forming tetrahedra.

[13:18] Ventrella Linden: The end effectors serve a specific purpose. Let me explain…

[13:19] Ventrella Linden: Standard skeletal animation has an “outward” flow:

[13:19] Ventrella Linden: Parent joints affect the locations of child joints.
[13:19] Ventrella Linden: …your basic hierarchical animation scheme.
[13:20] Ventrella Linden: next image (the inward diagram)

[13:20] Ventrella Linden: Physical avatar does the opposite. It takes the position of a joint…
[13:20] Ventrella Linden: …and calculates the rotation of its parent.
[13:20] Ventrella Linden: And so when you manipulate the avatar’s finger-tip (an end-effector)…
[13:20] Ventrella Linden: It rotates the wrist joint.
[13:20] Ventrella Linden: It’s sort of a variation of inverse kinematics – using physics.
[13:20] Ventrella Linden: Next movie please!
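Again as a toy illustration (mine, not Linden Lab’s code): the “inward” flow JJ describes, where a dragged end-effector position determines the rotation of its parent joint, looks roughly like this in 2D.

```python
import math

# Toy sketch of the "inward" flow: given a dragged end-effector
# position, compute the angle the parent joint must take so the bone
# points at the effector (2D; real IK works in 3D with quaternions).

def parent_rotation(parent_pos, effector_pos):
    """Angle in radians that aims the parent's bone at the effector."""
    dx = effector_pos[0] - parent_pos[0]
    dy = effector_pos[1] - parent_pos[1]
    return math.atan2(dy, dx)

# Dragging a fingertip straight above the wrist rotates the wrist to 90°.
angle = parent_rotation((0.0, 0.0), (0.0, 1.0))
```

Running this from effector to root, joint by joint, gives the physics-flavored variation of inverse kinematics mentioned above.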

constraints movie

[13:21] Ventrella Linden: I also added a number of “humanoid constraints”.
[13:21] Ventrella Linden: These keep things from bending the wrong way – like knees – (ouch!)

[13:21] Ventrella Linden: Physical Avatar was made to make it easy to alter the code to…
[13:21] Ehdward Spengler: this system would be even more useful if it could be applied to objects
[13:21] Ventrella Linden: allow residents to….
[13:21] Ventrella Linden: pose each other!
[13:21] Ventrella Linden: (not suitable for teens 🙂
[13:22] Ventrella Linden: next image…

[13:22] Ventrella Linden: That’s all folks!
[13:22] Ventrella Linden: Questions?
[13:22] Osprey Therian: Can a series of moves be recorded and played back, so to speak?
[13:23] Ventrella Linden: hmmm- Aura Linden..
[13:23] Ventrella Linden: worked on the part where…
[13:24] Ventrella Linden: you’ll be able to save poses, etc.
[13:24] Osprey Therian: Would it be possible, in the future, to make bvh files this way?
[13:24] Ventrella Linden: I don’t know the status right now.
[13:24] Man Michinaga: We saw mouse manipulation
[13:24] Man Michinaga: but are there going to be resources for external data input, like arduino, XML etc.
[13:25] Man Michinaga: can we take external input and apply it to this from device streams?
[13:25] Ehdward Spengler: no but you can have a free STFU card
[13:25] Aminom Marvin: STFU stands for Standardized Text Function Unwrapper. Through implementation of STFU user experience increases 200%
[13:25] Aminom Marvin: is there anything like puppeteering, hierarchical groups, or anything for prims or sculpted prims planned that is animation related?
[13:25] Ehdward Spengler: yes, could this be applied to objects? like build your own bone system, that would be great
[13:25] KRYSTAL Bleac: is this right and does it stay there?
[13:25] KRYSTAL Bleac: euh i saw on one of the pics the balls were making shadows on the ground…
[13:26] Ventrella Linden: yes – shadow
[13:26] KRYSTAL Bleac: those were shadows of the balls?
[13:26] Ventrella Linden: Those shadows were created by my physical avatar joints
[13:27] Ventrella Linden: they reflect pretty accurately
[13:27] Ventrella Linden: the location of the body above ground
[13:27] Shava Suntzu: Havoc/Euphoria? Will we ever have “natural” interactions with physical objects/landscape?
[13:27] Ventrella Linden: Euphoria is miles above this…and prolly too high end.
[13:27] Ayumi Cassini: is there any way we can test it ourselves?
[13:28] Political Magic: Is this technology in place now? Or is this a proposal?
[13:28] McGroarty Playfair: At present, the viewer with these features is in the public subversion, and there is one sim on the beta grid with the sim side support. You can see the features without needing to be on that sim, though.
[13:29] Ventrella Linden: Roger can explain more the current status of the code.
[13:29] Roger Fullstop: There’s a version of the “Puppeteering branch” that has been made public by Jake Linden on sldev
[13:30] Roger Fullstop: and it’s limited to local puppeteering (client side only)
[13:30] Roger Fullstop: I spent the most part of the last month cleaning up and fixing bugs
[13:31] Roger Fullstop: I logged a JIRA on this and will post a patch soon
[13:31] McGroarty Playfair: Info about the puppeteering branch – what it is, where to get it:
[13:31] Roger Fullstop: the code is open source
[13:31] Roger Fullstop: you can pick it up now
[13:32] Ventrella Linden: Open Source, y’all.
[13:32] Ayumi Cassini: is there a compiled version somewhere?
[13:32] Roger Fullstop: not that I know of
[13:32] Ayumi Cassini: 😦
[13:33] McGroarty Playfair:
[13:33] Evo Szuyuan: I’d be happy to organize a puppeteering workshop and downloads of a puppeteering client
[13:33] Evo Szuyuan: but let’s talk more about this after Roger speaks..
[13:33] Artm Udal: q: how do you actually controll the avatar?
[13:33] Artm Udal: say i wanna make a pose…
[13:33] Roger Fullstop: Ventrella implemented a “Ctrl” UI where you can select a joint and move it
[13:33] Roger Fullstop: but that’s joint by joint of course
[13:34] Roger Fullstop: that’s where the 3D camera will play a role (I hope)
[13:34] Feynt Mistral: Take it away Roger. >)
[13:34] Man Michinaga: yes – 3d cam
[13:35] Roger Fullstop: Maybe it’s time to start on the camera part of the talk?
[13:35] Ventrella Linden: cool.
[13:35] Evo Szuyuan: yes!
[13:35] Evo Szuyuan: Shall we start with a movie?
[13:35] Roger Fullstop: please do

[13:38] McGroarty Playfair: Philippe: Any special markers for that?
[13:38] Roger Fullstop: Nope
[13:38] McGroarty Playfair: Very cool!
[13:38] Roger Fullstop: it’s markerless mocap
[13:39] Saijanai Kuhn:
[13:40] Artm Udal: what’s a 3d camera?
[13:40] McGroarty Playfair: Artm: It’s a camera with implants.
[13:40] Roger Fullstop: the camera is a 3DV Systems camera
[13:40] Roger Fullstop: it’s a prototype
[13:40] Roger Fullstop: not commercially available yet
[13:41] Paul Bumi: expensive?
[13:41] Roger Fullstop: according to 3DV and other manufacturers, those cameras will be priced around $100
[13:41] Artm Udal: what exactly does it see?
[13:41] Roger Fullstop: give or take some…
[13:41] Saijanai Kuhn: wow that’s quite impressive, that’s for the low-end I would imagine
[13:42] Evo Szuyuan: basically it’s video tracking artm
[13:42] Evo Szuyuan: it’s called 3d because it also sees depth
[13:42] Artm Udal: like the thing in wii?
[13:42] Feynt Mistral: Not quite.
[13:42] Artm Udal: stereo?
[13:42] Evo Szuyuan: sometimes it’s done with 2 cameras
[13:42] Roger Fullstop: correct evo
[13:42] Feynt Mistral: It’s like sonar, only with lasers.
[13:42] Roger Fullstop: no, the wii tracks an active device
[13:42] Saijanai Kuhn: wii tracks two infrared beams, not normal visual input
[13:42] Roger Fullstop: this camera records an RGB and an IR video
[13:43] Artm Udal: what does this camera track? markers?
[13:43] Roger Fullstop: the IR is used to compute the depth channel
[13:43] Feynt Mistral: Infrared light is emitted by the LEDs around the camera, and they bounce off of the person.
[13:43] Anecdote Allen: it is like a simple motion capture system
[13:43] McGroarty Playfair: It pulses IR and looks at the intensity of what’s reflected back?
[13:43] Feynt Mistral: The light then bounces back to the camera, and it reads the intensity of the light.
[13:43] Roger Fullstop: Feynt: yes, that’s what this camera (3DV) does
[13:43] Artm Udal: ah i see
[13:44] Feynt Mistral: Light would be brightest off of surfaces that face the camera.
[13:44] Roger Fullstop: they use what they call TOF (Time Of Flight) tech
[13:44] Saijanai Kuhn: could make it more accurate by using 2 or more different frequencies
[13:44] Feynt Mistral: So in effect, it’s sonar, only with light.
[13:44] Artm Udal: nice. how heavy is it?
[13:44] Ventrella Linden: So, how do you light the subject matter?
[13:44] Neo692 Beck: very impressive….
[13:44] Saijanai Kuhn: normal visual uses normal lighting. The IR would be self-lighting
[13:44] Roger Fullstop: the camera includes the lighting Ventrella
[13:45] Ventrella Linden: Ah. cool.
[13:45] Ventrella Linden: straight-on lighting.
[13:45] Saijanai Kuhn: so it self-lights the RGB also?
[13:45] spinster Voom: does it need a lot of bandwidth?
[13:46] KRYSTAL Bleac: spinster Voom: nope, you just push a key using a program on your pc
[13:45] Neo692 Beck: yes Saijanai
[13:45] Roger Fullstop: note that other manufacturers of 3D (depth) cams are coming with alternative technologies
[13:45] Feynt Mistral: It could theoretically be foiled by having a lot of IR light from lighting in the room.
[13:45] McGroarty Playfair: Camera with z-buffer makes me excited and fidgety <3
[13:45] Feynt Mistral: Like bay windows.
[13:45] Artm Udal: we use the stereo one. but its depth is very noisy
[13:46] Roger Fullstop: Feynt: they have some ways to eliminate the ambient IR
[13:46] Feynt Mistral: Oh neat.
[13:46] Feynt Mistral: Probably a frequency deal.
[13:46] Roger Fullstop: but with my tests, indeed, glass panels behind the camera create lots of noise problem
[13:46] McGroarty Playfair: Possibly record frames with and without the local IR light, and compare frames.
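For the curious, the time-of-flight principle discussed above boils down to simple geometry: depth is half the distance the light travels on its round trip. Cameras like the 3DV prototype actually infer the timing from gated IR intensity rather than a stopwatch; the sketch below (my own, with invented names) only shows the underlying math.

```python
# Underlying geometry of time-of-flight (TOF) depth sensing: the IR
# pulse travels to the surface and back, so the distance to the
# surface is half the round-trip path length.

C = 299_792_458.0  # speed of light in m/s

def tof_depth(round_trip_seconds):
    """Distance in meters for a given round-trip time of the pulse."""
    return C * round_trip_seconds / 2.0

# A pulse returning after roughly 13.3 nanoseconds corresponds to ~2 m,
# which is about webcam range - hence the nanosecond-scale gating these
# cameras need.
d = tof_depth(13.34e-9)
```

The tiny time scales involved are why per-pixel gated intensity, rather than direct timing, is the practical implementation.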
[13:46] Evo Szuyuan: shall we go on with the next movie now that we know what a 3d cam is ?
[13:46] Roger Fullstop: sure, please do

[13:49] Artm Udal: the fist is to distinguish left and right hand?
[13:50] Roger Fullstop: Artm: sort of
[13:50] Artm Udal: coz at some point he had left hand to the right of the right 🙂
[13:50] Fau Ferdinand: it would be great to get out of the chair
[13:50] KRYSTAL Bleac: it looks fun but i don’t know if it’s easier than a mouse to build with…
[13:50] Feynt Mistral: Wouldn’t it make more sense to implement a system that makes more use of your hand gestures?
[13:51] Feynt Mistral: Like, moving your hands apart to scale, or moving in a circular motion to rotate?
[13:51] Paul Bumi: I like to move things fast and precisely when building.. this looks fun to do but anything but fast and precise
[13:51] Roger Fullstop: agree but my point was to show that I could do fine control of the hand tracking
[13:51] KRYSTAL Bleac: ok
[13:51] Roger Fullstop: indeed, a *real* implementation would require developing a specific interface
[13:52] Roger Fullstop: one that takes advantage of having your *hands* on the object
[13:52] Saijanai Kuhn: tactile feedback is a ways off, however… 😉
[13:52] Saijanai Kuhn: 3D camera with tactile feedback…
[13:52] Feynt Mistral: Nuts.
[13:53] Tickled Pink thinks of fingers on iphone screen being adapted for an SL input device. hmmmm.
[13:53] paulie Femto: Novint makes a haptic controller (the Falcon) which is available today and doesn’t cost a fortune.
[13:53] Man Michinaga: I am very much for the possibility of readily available full haptic worlds
[13:54] Osprey Therian: I’m always heartened to see more projects that increase our possibilities. I think we don’t quite understand how constrained we are atm.
[13:53] Ventrella Linden: In my opinion,
[13:53] Ventrella Linden: the whole virtual worlds movement will benefit from this more kinematic input.
[13:53] Ventrella Linden: Better for your health.
[13:53] Roger Fullstop: Ventrella: agree with you
[13:53] Roger Fullstop: this is why I started toying with puppeteering 2 months ago
[13:54] Roger Fullstop: I have a version with puppeteering with the cam
[13:54] Roger Fullstop: but I need to fix some scaling issues (scale per bone)
[13:52] Artm Udal: does the camera sees like two spots for hands? or can it distinguish fingers?
[13:53] Artm Udal: you could chat in sign language then
[13:53] McGroarty Playfair: Shot for making gang signs. I was only trying to build a torus. 😦
[13:53] Feynt Mistral: Yes, on that note, Ventrella can you talk to someone about giving us finger control for sign language?
[13:54] Saijanai Kuhn: There was a demo a couple of years ago of taking standard sign language notation and converting it to animation.
[13:54] Saijanai Kuhn: you could do the opposite. Take text input and convert it to sign language
[13:54] Ventrella Linden: Roger and I discussed briefly…
[13:54] Ventrella Linden: finger and hands.
[13:54] Ventrella Linden: Physical avatar adds a few joints.
[13:55] Ventrella Linden: It could add more, and then…
[13:55] Ventrella Linden: you’d have hand skeletons.
[13:55] Ventrella Linden: (not that it’ll be easy, but…)
[13:54] Evo Szuyuan: Roger.. can you tell us about the reaction of the people who tried it?
[13:55] Roger Fullstop: Evo: every time I have people using it, they love the feeling of doing something with their whole body
[13:55] Roger Fullstop: it’s much more engaging
[13:55] Roger Fullstop: I need to allow doing this sitting though
[13:56] Roger Fullstop: as lots of people asked for it
[13:56] Roger Fullstop: for the moment, my tracking SDK works better in a standing position though
[13:57] Aymeric Yalin: how about puppeteering jaws and cheeks, and have the face move as we speak, or smile?
[13:57] McGroarty Playfair: Cool thing about middle mouse button for push to talk is you can still pick up the mouse and gesticulate.
[13:58] Roger Fullstop: Anyway, I’m just starting for the moment, there’s loads of problems to solve for sure (hands, gaze, etc…)
[13:58] Hypatia Callisto: it’s exciting 😀
[13:58] McGroarty Playfair: Gaze would be wonderful
[13:58] Man Michinaga: for sure
[13:58] Roger Fullstop: Aymeric: face animation is really exciting and I’ve looked at a couple of technos for this
[13:59] spinster Voom: unintentional movements could be a problem? fidgeting etc?
[13:59] Roger Fullstop: It looks like depth won’t help much though
[13:59] Hypatia Callisto: my world for facial animation
[13:59] Roger Fullstop: the RGB stream will be better
[13:59] Roger Fullstop: and there’s tons of literature on the subject
[13:59] Saijanai Kuhn: yeah, google MPEG BAP OR FAP
[13:59] McGroarty Playfair: Roger: Please please do push for a developer release, not just holding out until this is all done before release.
[14:00] Roger Fullstop: McGroarty: I hear you but, right now, I’m linking statically with camera SDKs that are very GPL unfriendly
[14:00] Roger Fullstop: I need to solve that before releasing something
[14:00] Roger Fullstop: also, remember that the cameras are not yet available…
[14:01] Feynt Mistral: Not a problem really, Roger. I’m sure we could come up with our own IR cameras. >3
[14:01] Feynt Mistral: Those of us who would bother with compiling the client would probably be able to jury-rig such a device.
[14:01] Saijanai Kuhn: what does the protocol look like from client to server?
[14:01] Roger Fullstop: Saijanai: all you’ve seen so far doesn’t require any change to the server side (navigation and editing, I mean)
[14:02] Saijanai Kuhn: well, you’re sending something from client to server that isn’t part of the usual stream, aren’t you?
[14:02] Roger Fullstop: puppeteering does but LL has some work done on that
[14:03] Ventrella Linden: more questions?
[14:03] Evo Szuyuan: yes! many more 🙂
[14:03] Roger Fullstop: shoot
[14:03] Evo Szuyuan: to go back to an earlier question of artm about puppeteering..
[14:04] Evo Szuyuan: how cpu intensive is it?
[14:04] Ventrella Linden: The viewer side shouldn’t be too bad.
[14:04] Ventrella Linden: Could be optimized.
[14:04] Artm Udal: especially when there are many avatars next to each other bumping
[14:04] Ventrella Linden: Server side is a different domain.
[14:04] Man Michinaga: right
[14:05] McGroarty Playfair: Man: At present, LL has indicated no definite plans for taking it forward. If the community does something interesting that could easily change.
[14:05] McGroarty Playfair: Man: You can subscribe to the sldev mailing list or look through the archives to see discussion on that.
[14:06] Evo Szuyuan: right now you can compile a viewer, but server side it’s only available on 1 sim on the beta grid
[14:07] Ayumi Cassini: which sim?
[14:07] Evo Szuyuan: puppeteering sim
[14:07] Osprey Therian: I took that to mean if the community did things to get it rolling there might be specialised viewers.
[14:07] Evo Szuyuan: yes osprey..
[14:07] Osprey Therian: and we could prod them into doing the serverside bit
[14:08] Osprey Therian forms a guerrilla ragdoll underground army
[14:07] Evo Szuyuan: if people are up to it, it would be great to see what we can do now, and streamline some thoughts
[14:08] Roger Fullstop: would be great to have the server side supported by LL indeed
[14:08] McGroarty Playfair: Ayumi: You can use the features without being on an enabled sim as well – it’s just that only you see the results.
[14:08] Saijanai Kuhn: just wondering at how it is streamed to the server (AWG territory there)
[14:09] McGroarty Playfair: Saijanai: There are just a couple extra messages. You should be able to isolate those easily.
[14:09] IntLibber Brautigan: a single pose tends to consume three emails of data
[14:10] Saijanai Kuhn: KK. You know we’re going http for almost everything. Seems that http might be more reliable but not sure if it’s suitable for such a continuous stream of data
[14:05] paulie Femto: questions on the camera: who developed it? Did you guys build it yourselves? What’s the resolution and would improving the res add anything?
[14:05] Roger Fullstop: Femto: there are a bunch of camera manufacturers. I currently developed my code to support 2 of them
[14:06] Roger Fullstop: (I got prototypes for)
[14:06] Roger Fullstop: The resolution of the camera is 640×480 (the proto, the commercial ones are supposed to go higher)
[14:07] Saijanai Kuhn: I think you’d get wider initial adoption with a keyboard interface (computer and/or MIDI)
[14:08] Ventrella Linden: any other Qs?
[14:08] Evo Szuyuan: what would it take to implement script functions for the puppeteering?
[14:09] Roger Fullstop: Haven’t thought about that at all
[14:09] Ventrella Linden: That was one of our original ideas – and I was never able to implement that – but
[14:09] Ventrella Linden: it would be really empowering.
[14:09] Roger Fullstop: I’m puzzled. Example of what you have in mind Ventrella?
[14:10] Ventrella Linden: So for instance,
[14:10] Ventrella Linden: instead of the mouse cursor moving joints in Physical Avatar,
[14:10] Ventrella Linden: we design scripts that allow LSL to move them-
[14:10] Ventrella Linden: using various high-level commands.
[14:11] Feynt Mistral: I honestly don’t see what the issue would be if you treated the end effectors like attachments, allowing you to set their positions.
[14:11] Feynt Mistral: llMoveJoint(LEFT_WRIST, llGetPos());
[14:11] Hypatia Callisto salivates over lsl control
[14:11] Ventrella Linden: Physical Avatar code doesn’t care what’s moving its joints.
[14:11] Roger Fullstop: easier than the current animation files… interesting…
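To make the idea concrete, here is a toy sketch of what script-level puppeteering commands could look like. Everything in it (function name, joint names, tuples) is hypothetical; no such LSL or viewer API exists, and Feynt’s llMoveJoint above is likewise only a proposal.

```python
# Hypothetical sketch of script-driven puppeteering: high-level calls
# record world-space targets for end-effectors, and a physics step
# (not shown) would then pull the skeleton toward those targets.

effector_targets = {}

def move_joint(effector, position):
    """Set a target position for one end-effector (invented API)."""
    effector_targets[effector] = position

# A script could pose the avatar without any BVH animation file:
move_joint("LEFT_WRIST", (0.3, 1.2, 0.9))
move_joint("HEAD", (0.0, 1.7, 0.0))
```

The appeal, as Roger notes, is that this would be far lighter-weight than authoring and uploading animation files.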
[14:11] Evo Szuyuan: and say you want to combine this with motion like walking?
[14:11] Ventrella Linden: Combine with walking…
[14:11] Ventrella Linden: one of the more complex things about this is that…
[14:11] Ventrella Linden: it tried to make traditional avatar animation get along with…
[14:12] Ventrella Linden: physically-based, forward dynamics.
[14:12] Ventrella Linden: Not easy.
[14:12] Ventrella Linden: But do-able.
[14:10] Saijanai Kuhn: the BAP/FAP protocol is pretty compressed and looks to implement the entire system you’ve shown. Wondering if that should be standardized on
[14:12] Saijanai Kuhn: facial animation in MPEG-4
[14:14] Saijanai Kuhn: body animation for MPEG-4
[14:13] Evo Szuyuan: roger.. what kind of usage do you see for your technology?
[14:13] Roger Fullstop: Evo: first, we’d like to make meetings in SL more natural, with gesture and body language becoming part of it
[14:14] Rober1236 Jua: well Cutlure produces gestures
[14:14] Roger Fullstop: that would improve the attractiveness and efficiency of meetings in world
[14:14] Rober1236 Jua: gestures that are automatic that we don’t have to think about
[14:14] Ventrella Linden: For example, I’m not looking at the audience.
[14:14] Rober1236 Jua: you would want to animate these gestures not have to run them
[14:14] Ventrella Linden: Not enough body language to make it worth the low framerate.
[14:15] Roger Fullstop: exactly, right now, a meeting is IM with a nice background
[14:15] Rober1236 Jua: and VOIP
[14:15] Rober1236 Jua: and file exchanges and email and other things
[14:15] Evo Szuyuan: true
[14:15] Roger Fullstop: we don’t really take advantage of the 3D aspect of the world, i.e., relating to others in a 3D space
[14:15] Ventrella Linden: Roger…
[14:16] Ventrella Linden: does the puppeteering work correctly for you – in “local mode”?
[14:16] Roger Fullstop: err… after some debugging, yes 🙂
[14:16] Ventrella Linden: I recall the server interpretation causing joints to be way off.
[14:17] Roger Fullstop: at the beginning, I crashed after one puppeteering session 🙂
[14:17] Ventrella Linden: I’m hoping that it is still looking right in local mode.
[14:17] Roger Fullstop: Ventrella: the issue of joints being way off happened also in local mode
[14:17] Ventrella Linden: curious why.
[14:18] Roger Fullstop: that had to do with the init (getting the copy of the joint position) from LLVOAvatar not being done at the right time
[14:18] Roger Fullstop: under some conditions
[14:18] Ventrella Linden: yea – that sounds right.
[14:18] Ventrella Linden: LLVOAvatar – ah – memories.
[14:18] Roger Fullstop: Yeah, I had to track the whole flag state logic and rewrite it from scratch basically
[14:19] Roger Fullstop: now it works 🙂
[14:19] Ventrella Linden: you rule.
[14:19] Roger Fullstop: I also added rotational puppeteering
[14:19] Roger Fullstop: i.e.: you can’t turn the gaze of the head right now
[14:19] Roger Fullstop: or rotate the hands
[14:19] Ventrella Linden: Good job.
[14:20] Roger Fullstop: so I added an extra modifier to do just that
[14:20] Ventrella Linden: Any other questions from the gallery?
[14:20] Evo Szuyuan: there was a question earlier about other avatars than humanoids
[14:20] Ventrella Linden: Ah…
[14:20] Ventrella Linden: A fave topic.
[14:20] paulie Femto: 🙂
[14:20] Ventrella Linden: uh, what’s the question?
[14:21] Evo Szuyuan: could the humanoid limit be bypassed?
[14:21] paulie Femto: imagine puppeteering a 4-legged avatar. 🙂
[14:21] Ventrella Linden: With enough coding, of course 🙂
[14:21] Humming Pera: or leave the coding more open
[14:21] Ventrella Linden: If this were my own virtual world…
[14:21] Ventrella Linden: we’d all be open-ended animals to begin with.
[14:21] Evo Szuyuan: other joints need to be known
[14:22] Ventrella Linden: Yes – here’s the thing…
[14:22] Humming Pera: open-ended would be much better
[14:22] Ventrella Linden: When you have a known skeleton, it’s much simpler.
[14:22] Ventrella Linden: when you want an open-ended skeleton, you need more procedural umph – like Spore.
[14:22] Saijanai Kuhn: MPEG-4 defines a generic humanoid, and skips to generic bone structure. Nothing in between
[14:23] Saijanai Kuhn: seems to me that optimized quadrupeds and so on could be devised
[14:23] Ventrella Linden: Absolutely.
[14:23] Humming Pera: even speaking in these terms like skeleton implies biological life, which this isn’t…
[14:23] Saijanai Kuhn: most furries are humanoid or four-legged as far as I know
[14:23] Ventrella Linden: I developed an animal skeleton while at LL, but…
[14:23] Malburns Writer: a fish avie?
[14:23] Ventrella Linden: Like so many innovations like that,
[14:23] Ventrella Linden: they took a back seat to stability.
[14:23] Osprey Therian: 😦
[14:23] paulie Femto: 😦
[14:23] Evo Szuyuan: ahh.. so we can start nagging for that code too 🙂
[14:23] Darien Caldwell: lol
[14:24] Saijanai Kuhn: that was the result of the Open Letter calling for freezing innovation til all the bugs were worked out
[14:24] Ventrella Linden: I can’t speak to those decisions – not my call.
[14:25] Hypatia Callisto: I remember Runitai having worked on sculpties being able to adopt the skeleton of the avatar as well
[14:25] Hypatia Callisto: haven’t heard anything about that recently
[14:21] Political Magic: why is 3d video better than having sensors on your body?
[14:22] Roger Fullstop: political: few people like to put on markers… 🙂
[14:23] Political Magic: Is it true that using markers is much easier and more reliable than 3d video?
[14:24] Saijanai Kuhn: Political, it would have to be, but not many people want to wear little metallic dots on their face and clothing
[14:24] Roger Fullstop: political magic: it’s more reliable and precise so that’s what movie studios do, but it’s not easier
[14:25] Roger Fullstop: especially if you’re the one who needs to put the markers on 🙂
[14:25] Malburns Writer: i always imagined multiple sensors (4-8) in rl environment and wearing a small badge to calibrate self
[14:25] McGroarty Playfair: I’ve done mocap. For full body, you spend more time maintaining the markers and recalibrating than you spend shooting.
[14:25] Roger Fullstop: personally, I can’t imagine putting markers on my face is going to work 🙂
[14:25] Ventrella Linden: Markers on glasses – a little easier.
[14:26] Roger Fullstop: yeah but that’s just a couple of points
[14:26] Osprey Therian: Need implants 😀
[14:26] Saijanai Kuhn: Johnny Lee showed using Wii input converted to goggles on youtube. Very cool. I invited him to this presentation but I doubt he got the email
[14:26] Ventrella Linden: Yea – so real face MOCAP should be done optically.
[14:26] Ventrella Linden: with cameras.
[14:26] Roger Fullstop: I loved Johnny Lee Wii videos!
[14:27] Saijanai Kuhn:
[14:27] Roger Fullstop: The first demo I did was a copy of his “point of view tracking” demo but without the Wii
[14:27] McGroarty Playfair: Oooh nice
[14:27] Political Magic: How adaptable is the Wii technology to SL?
[14:28] Political Magic: And is this something to focus on?
[14:28] Roger Fullstop: I heard of some folks trying to adapt the wiimote
[14:28] Roger Fullstop: after all, there’s a joystick driver for it
[14:28] Bingo Onomatopoeia: I did that…
[14:28] Bingo Onomatopoeia: …moving with the nunchuk is nice
[14:28] Saijanai Kuhn: we need a generic plugin architecture for avatar control
[14:29] Saijanai Kuhn: needs to allow for mouse-like AND full puppeteering control
[14:29] Ventrella Linden: multiple inputs.
[14:29] Saijanai Kuhn: Ventrella, are those packets separate or part of some existing message packet?
[14:30] Ventrella Linden: what packets?
[14:30] Saijanai Kuhn: I put in the suggestion to map fundamental animation primitives to hotkeys or chorded keys or even MIDI input
[14:30] Ventrella Linden: (packet question first..)
[14:30] Saijanai Kuhn: MIDI*
[14:30] Man Michinaga: yes.
[14:30] Man Michinaga: MIDI INPUT!
[14:31] j3rry Paine: yayyyy sai
[14:31] Man Michinaga: That’s what SL NEEDS!!
[14:31] paulie Femto: imagine playing a keyboard or instrument to make yer av dance. 🙂
[14:31] Ventrella Linden: sorry – there was another question…?
[14:31] Saijanai Kuhn: of course we need standardized MIDI instruments but that doesn’t exist for Linux afaik
[14:31] Tara Yeats: MIDI input would be very cool for puppeteering
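[Editor’s note: the MIDI-to-animation mapping Saijanai and Man are asking for could look something like this minimal Python sketch. The note numbers, primitive names, and function are all invented for illustration; nothing like this exists in the viewer.]

```python
# Hypothetical sketch: map MIDI note-on events to named animation
# primitives, using velocity (0-127) as intensity. Purely illustrative,
# not part of any Second Life viewer code.

ANIMATION_MAP = {
    60: "raise_left_arm",   # middle C
    62: "nod_head",
    64: "wave",
}

def on_midi_note(note, velocity):
    """Translate a MIDI note-on into an (animation, intensity) command.

    Returns None for unmapped notes; velocity is scaled to 0.0-1.0.
    """
    name = ANIMATION_MAP.get(note)
    if name is None:
        return None
    return (name, velocity / 127.0)
```

A chorded-keys version would be the same idea with key combinations as dictionary keys instead of note numbers.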
[14:31] Oz Larg: Animusic 😀
[14:31] Osprey Therian: singing or humming hah
[14:32] Humming Pera: :-))
[14:32] Saijanai Kuhn: Ventrella, the packets that are sent for the puppeteering on the server side
[14:32] Ventrella Linden: sorry – what about the packets?
[14:33] Saijanai Kuhn: ok, let’s say you do a control of the avi’s joints. Is that cached and sent as a big chunk of BVH, or as a stream of separate modifications, or what
[14:33] Ventrella Linden: Ah…
[14:34] Ventrella Linden: Well, before Roger has his take on this one…
[14:34] Saijanai Kuhn: OK, thought canned animations were cached on client side. Didn’t realize they were streamed already.
[14:34] Ventrella Linden: The goal is to not have to send too much stuff over whenever you change a
[14:34] Ventrella Linden: So, it’s NOT a BVH.
[14:34] Ventrella Linden: It was originally just the one joint…
[14:34] Ventrella Linden: and each viewer knew how to run the physics (well, kind of).
[14:35] Ventrella Linden: Then it was changed to all the joints sent over…
[14:35] Ventrella Linden: But the details were known well by Cube Linden, and now,
[14:35] Ventrella Linden: I think Roger knows that part best. Roger?
[14:35] Roger Fullstop: Actually, I haven’t played with the roundtripping through server yet
[14:36] Roger Fullstop: I’ve seen and read the code though
[14:36] Roger Fullstop: and, indeed, it’s still the way you just described
[14:36] Evo Szuyuan: ah
[14:36] Saijanai Kuhn: OK, I’m the protocol documentation guy for the AWG so I get into this stuff. Sorry if I’m boring anyone
[14:36] Roger Fullstop: I’ll be diving into the protocol part later this month
[14:37] Roger Fullstop: for the moment, I’m finishing the local tracking part
[14:37] Saijanai Kuhn: Should give me a holler if you need an assistant
[14:37] Roger Fullstop: Saijanai: cool! ping me on sldev
[14:37] Saijanai Kuhn: the eventual goal is to document all current protocols as a baseline for redesigning them for the open grid
[14:39] Saijanai Kuhn: was looking more into the Body Animation Parameters (BAP) protocol. They have a pretty extensible system. Not sure if it’s what is required for SL or not
[14:40] Saijanai Kuhn: shameless plug. If any technogeeks want to help design the SL 2.0 protocols, IM me for AW Groupies membership
[14:40] Saijanai Kuhn:
[14:41] Saijanai Kuhn: don’t tell Zero I said SL “2.0” though
[14:41] Roger Fullstop thinks he should join that group
[14:41] Evo Szuyuan: the thing i wonder is how to proceed with puppeteering if we can’t look at server side of things..
[14:41] Evo Szuyuan: as long as it all works no problem of course..
[14:41] Saijanai Kuhn: Evo, as long as you know what the protocols are supposed to do, it shouldn’t matter TOO much
[14:41] Ventrella Linden: One idea is this…
[14:41] Ventrella Linden: if it’s just viewer-side,
[14:41] McGroarty Playfair: Evo: You could pass stuff encoded in chat as an interim fix.
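[Editor’s note: McGroarty’s “encode it in chat” workaround might look like this toy Python sketch: serialize joint angles into a chat line and have listening viewers decode it. The `PUPPET|` format and function names are invented for illustration.]

```python
# Toy sketch of passing puppeteering data through ordinary text chat as an
# interim fix, until a real server-side protocol exists. Invented format.

def encode_pose_to_chat(pose):
    """pose: dict of joint name -> angle (degrees) -> compact chat string."""
    body = ";".join(f"{joint}={angle:.1f}" for joint, angle in sorted(pose.items()))
    return "PUPPET|" + body

def decode_pose_from_chat(message):
    """Parse a PUPPET| chat line back into a pose dict, or None if the
    message is ordinary chat."""
    if not message.startswith("PUPPET|"):
        return None
    pose = {}
    for field in message[len("PUPPET|"):].split(";"):
        joint, angle = field.split("=")
        pose[joint] = float(angle)
    return pose

# Round trip: encode on the sender, decode on every listening viewer.
msg = encode_pose_to_chat({"head": 12.5, "l_arm": -30.0})
# msg == "PUPPET|head=12.5;l_arm=-30.0"
```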
[14:41] Rober1236 Jua: Yah work for Microsoft for a while you get good at it
[14:42] Evo Szuyuan: hehehehe
[14:42] Rober1236 Jua: I’m serious
[14:42] Ventrella Linden: it can be used to save poses – to make animations. That’s what Aura was (is?) working on.
[14:42] Evo Szuyuan: aha!
[14:42] Hypatia Callisto: that’s fantastic, and pretty much like what we could do way back when in a little place I was once in (rose)
[14:42] Roger Fullstop: That code is still in there though I haven’t tried it yet
[14:43] Evo Szuyuan: that’s the internal poser right?
[14:43] Roger Fullstop: yeap
[14:43] Evo Szuyuan: thanks saij
[14:44] Man Michinaga: what would be really cool is to map sensor input to nodes for puppeteering of av IK nodes or object control
[14:44] Roger Fullstop: the thing with puppeteering is that you need to track much more than a couple of points for realistic movements
[14:44] Political Magic: What is the difference between puppeteering and just animated bots in SL?
[14:44] Malburns Writer: animation=effects puppeteering=function ???
[14:44] Evo Szuyuan: animation is pre recorded
[14:45] Ventrella Linden: Yes – what Evo sez.
[14:45] Political Magic: OK, it’s real time, but that is what an avatar does… how is puppeteering different?
[14:45] Ventrella Linden: As Evo was saying…
[14:45] Ventrella Linden: Normal animation is pre-recorded.
[14:46] Ventrella Linden: Think of it as a movie being played.
[14:46] Ventrella Linden: Then you stop the movie and
[14:46] Ventrella Linden: grab control of the skeleton – that’s puppeteering.
[14:46] Ventrella Linden: When you’re done, it falls back into normal mode.
[14:46] Ventrella Linden: That’s an oversimplification,
[14:47] Ventrella Linden: but gets the basic idea, I think.
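[Editor’s note: to make JJ’s “stop the movie and grab the skeleton” picture concrete, here is a minimal Python sketch of blending live joint overrides on top of a pre-recorded pose. All names are hypothetical, not actual viewer code.]

```python
# Illustrative sketch: puppeteering overrides selected joints while the
# canned animation keeps driving everything else; blend eases control
# back to the animation when puppeteering ends. Invented names.

def resolve_pose(animation_pose, puppet_overrides, blend=1.0):
    """Return final joint angles for one frame.

    animation_pose:   dict joint -> angle from the pre-recorded animation.
    puppet_overrides: dict joint -> angle for joints under live control.
    blend: 0.0 = pure animation, 1.0 = puppeteering fully in control.
    """
    pose = dict(animation_pose)
    for joint, target in puppet_overrides.items():
        base = pose.get(joint, 0.0)
        pose[joint] = base + blend * (target - base)
    return pose

# While puppeteering the left arm, only that joint deviates from the movie:
frame = resolve_pose({"l_shoulder": 10.0, "head": 0.0},
                     {"l_shoulder": 80.0}, blend=1.0)
# frame["l_shoulder"] == 80.0, frame["head"] == 0.0
```

With blend ramped back toward 0.0 over a few frames, the avatar "falls back into normal mode" smoothly, as described above.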
[14:48] Ventrella Linden: Well,
[14:48] Ventrella Linden: RL is calling me.
[14:48] Osprey Therian: Thank you very much, Ventrella.
[14:48] McGroarty Playfair: Best luck on current endeavors!
[14:49] Rober1236 Jua: Wondering, how we use our body is a function of the social context we are in, does it not make sense to just create a vocabulary of understood gestures for the context of SL rather than “puppeting” from RL?
[14:49] Malburns Writer: Yes – thanks all – highly interesting
[14:49] Evo Szuyuan: Yes, JJ and Philippe!
[14:49] Evo Szuyuan: I’ll post the chat transcript on the dorkbot blog
[14:49] spinster Voom: oh good evo, think i need to reread it at least a couple of times lol
[14:49] Saijanai Kuhn: Tree Kyomoon has a nice wikifier for chat logs. Adds SL avatar pages and so on
[14:49] Rober1236 Jua: thank you
[14:49] Hypatia Callisto: thank you very much, this was a great presentation
[14:50] Evo Szuyuan: many many thanks for sharing this with us and being here
[14:50] Ventrella Linden: Roger – great work –
[14:50] Roger Fullstop: that was a pleasure to meet with you
[14:50] Ventrella Linden: glad you’re giving the code some TLC.
[14:50] Tara Yeats: excellent session!
[14:50] Ventrella Linden: ok – later all…
[14:51] Roger Fullstop: I’m going to stay around for another 10 minutes then sign off
[14:51] Osprey Therian: for upcoming event info where should we go Evo?
[14:51] Evo Szuyuan: i’d love to try the camera!
[14:51] Evo Szuyuan: please join dorkbot group
[14:51] Maxxo Klaar: thank you all – bye
[14:51] Osprey Therian: is it open to join?
[14:52] Evo Szuyuan: yes..
[14:52] Evo Szuyuan: dorkbot SL electronic art meetings
[14:52] Evo Szuyuan: Fau..
[14:52] Evo Szuyuan: what would you do if you had the 3d cam ?
[14:52] Fau Ferdinand: ah
[14:53] Fau Ferdinand: I’d pretend I’m looking for my head on the floor
[14:53] Bingo Onomatopoeia: I would build a 3d sound-editor or synth
[14:53] Evo Szuyuan: 😀
[14:53] Roger Fullstop: there’s more to 3D cams than just puppeteering
[14:54] Evo Szuyuan: yes of course!
[14:54] Evo Szuyuan: i would love to see face animation too
[14:54] Roger Fullstop: face animation needs to be done with mesh blending
[14:55] Roger Fullstop: in SL
[14:55] Roger Fullstop: I’d love to have someone working on this with me
[14:55] Evo Szuyuan: are you doing everything by yourself now?
[14:55] Roger Fullstop: yeap, just me and Mitch… but he doesn’t code much… 😀
[14:55] Evo Szuyuan: haha.. no wonder you’re lead programmer 🙂
[14:55] Evo Szuyuan: just kidding
[14:55] Saijanai Kuhn: Talk to Ina Centaur. She directs the SL Shakespearean company and I know they’ve been doing work with lipsynched machinima. Might be some programmers who could contribute
[14:55] Roger Fullstop: a leader of a team of *1* 😀
[14:55] Evo Szuyuan: I was hoping Mm Alder would be here
[14:55] Evo Szuyuan: he implemented lipsync for SL
[14:55] Saijanai Kuhn: this is exactly what they’ve been hoping for, body and facial animation
[14:56] Evo Szuyuan: Ina is using crazy talk
[14:56] Saijanai Kuhn: ah, OK
[14:56] Roger Fullstop: I heard about that lipsync patch but haven’t had time to play with it
[14:56] Saijanai Kuhn: but I know they’ve talked about puppeteering hoping it would be available
[14:56] Evo Szuyuan: there has been a lot of talk on bringing more expression to avatars on the machinima group
[14:57] Oz Larg: Back in the way old days we had lip synchronization in the Traveler platform
[14:57] Saijanai Kuhn got a hippo nomination for typing too much
[14:57] Roger Fullstop: Those guys should love puppeteering + 3D cam 🙂
[14:57] Evo Szuyuan: i found a link to discussion notes from 2006 on puppeteering
[14:57] Evo Szuyuan: they thought it would make SL nr 1 for machinima
[14:57] Saijanai Kuhn: well, I’m sure it would
[14:58] Evo Szuyuan: i think a lot of the concern was about making the UI more complicated
[14:58] Roger Fullstop: well, the cam should make the UI much more simple…
[14:58] Evo Szuyuan: absolutely!
[14:59] Roger Fullstop: that’s the goal at least
[14:59] Malburns Writer: i’m using lipsynch and love it – but needs wider range of gestures still
[14:59] Roger Fullstop: tantalizing…
[14:59] Oz Larg: Malburns: is that trans?
[14:59] Saijanai Kuhn: I heard that it only triggers on volume right now, not phonemes
[14:59] Malburns Writer: trans?
[15:00] Oz Larg: phonemes were used in the Traveler platform
[15:00] Oz Larg: built into the vocoder
[15:00] Saijanai Kuhn: when sound comes over the microphone, the louder the sound, the bigger the mouth. Phonemes would use AI to shape the mouth differently for different sounds
[14:58] Evo Szuyuan: true
[14:59] Evo Szuyuan: right now you can only get information on the volume
[14:59] Evo Szuyuan: and to do phoneme detection you need an audio stream per avatar
[15:00] Evo Szuyuan: Vivox will come with new functions for developers that hopefully will enable this
[14:59] Saijanai Kuhn wonders about non-english phonemes though
[15:01] Oz Larg: A lot of this phonemes stuff was done in the old day, I can hook you up to the people who designed it
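[Editor’s note: the volume-only lipsync Saijanai describes is simple to picture. Here is a Python sketch with invented names; the real lipsync patch works inside the viewer’s audio pipeline, not like this.]

```python
# Illustrative sketch of volume-driven lipsync: the mouth opens in
# proportion to the loudness (RMS) of each incoming audio chunk.
# Phoneme detection would instead classify the sound and pick a
# mouth *shape*, which needs a per-avatar audio stream.
import math

def mouth_open_from_samples(samples, gain=4.0):
    """Map one chunk of audio samples (floats in [-1, 1]) to a
    mouth-open value in [0, 1] using RMS loudness."""
    if not samples:
        return 0.0
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return min(1.0, rms * gain)
```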
[15:01] Saijanai Kuhn: trying to see if Linux users have MIDI playback in SL via QT movies. If they do, then you have a universal MIDI instruments format available. Could use it to send long streams of animation synched with music
[15:02] Saijanai Kuhn is thinking an official VAG for puppeteering would be a good thing. Give it an official page, tie-in with AWG/AW Groupies, etc
[15:02] Saijanai Kuhn:
[14:59] Rober1236 Jua: Animations are like words in the body language, animations will always be needed like characters
[15:00] Rober1236 Jua: you might puppet it but you will want to send certain animations rather than having to act out long complex gestures
[15:00] Evo Szuyuan: maybe robert..
[15:00] Evo Szuyuan: but for example..
[15:01] Rober1236 Jua: Gesture only matters for meaning
[15:01] Evo Szuyuan: we’re working on a movie right now..
[15:01] Evo Szuyuan: and there are no subtle anims in SL
[15:01] Evo Szuyuan: there is riding a horse
[15:01] Rober1236 Jua: well SL anims are primitive that is true
[15:01] Evo Szuyuan: but not ‘going up on the horse’
[15:02] Evo Szuyuan: also synchronization between avi’s is difficult
[15:02] Rober1236 Jua: the thing about gestures is they communicate
[15:02] Rober1236 Jua: so you can keyboard or type them
[15:02] Rober1236 Jua: you don’t scratch your head to scratch your head but to show you are wondering
[15:02] Rober1236 Jua: gestures that have no intention are like picking your nose
[15:02] Evo Szuyuan: if you have joint information you can have more fluid interaction between avatars
[15:02] Rober1236 Jua: that is true
[15:02] Rober1236 Jua: but the communication is selecting gestures
[15:03] Rober1236 Jua: that have meaning you wish to communicate
[15:03] Rober1236 Jua: a keyboard is very interesting
[15:03] Rober1236 Jua: or maybe some kind of plastic substance like a clay
[15:03] Roger Fullstop: Robert: I’m using gesture to communicate in RL but not in SL ’cause babysitting the avatar for it is too much work
[15:03] Rober1236 Jua: Well the goal of realism is that gesture, as in RL, will be intentional
[15:04] Rober1236 Jua: so though puppeting might accomplish it the goal is communication not motion for motion’s sake
[15:04] Rober1236 Jua: in fact I would like to move in ways I could not puppet
[15:04] Roger Fullstop: ? gestures in RL are intentional
[15:04] Rober1236 Jua: well some like picking our nose or rear are not
[15:05] Rober1236 Jua: but for the most part how we use our bodies in public is learned and has social intention
[15:05] Rober1236 Jua: gestures like scratching or petting things are often considered rude
[15:05] Roger Fullstop: what about projecting that social intention with no or little UI?
[15:05] Rober1236 Jua: well that is not very much fun
[15:06] Rober1236 Jua: I can just text that
[15:06] Evo Szuyuan picks her nose
[15:06] Rober1236 Jua looks bored with a world without gesture
[15:06] Rober1236 Jua snickers at someone picking their nose
[15:06] Rober1236 Jua: but you just made a metagesture
[15:06] Roger Fullstop: right now, I’m not looking at anyone in SL, just the stream of text in the IM window
[15:06] Rober1236 Jua: you made a gesture to reference a gesture which means its a message
[15:06] Roger Fullstop: all the comm is there
[15:07] Rober1236 Jua: Yes but it is dull
[15:07] Roger Fullstop: what’s the plus of being in SL if everything is entered and rendered as text?
[15:07] Rober1236 Jua: it lacks the shades and colours that a full body can bring to communication
[15:07] Evo Szuyuan: i think you can have both
[15:07] Rober1236 Jua: We are all in agreement on this, I just want to make clear that most linguists view gesture as communication
[15:07] Roger Fullstop: I think we need both
[15:07] Evo Szuyuan: i wouldn’t want everything to be just like RL here
[15:08] Rober1236 Jua: communication and not fidelity of body movement must be the objective
[15:08] Roger Fullstop: gesture is comm
[15:08] Evo Szuyuan: but i certainly want more freedom of expression
[15:08] Rober1236 Jua: Same here, I want to do back flips when I am happy
[15:08] Roger Fullstop: that’s why you want it in SL, with little or no extra “texting” to transfer it
[15:08] Rober1236 Jua: I want to jump over the moon with joy
[15:08] Rober1236 Jua: but what coding, what words, and how is it expressed?
[15:09] Evo Szuyuan: but when someone is talking i also want to know they are talking..
[15:09] Rober1236 Jua: Yes and how they use their arms and feet
[15:09] Evo Szuyuan: and right now that’s very difficult
[15:09] Rober1236 Jua: and I want to also fill my hair with wind when I am being important
[15:09] Roger Fullstop: indeed, all that communicate
[15:09] Rober1236 Jua: right now we don’t know how to do it
[15:09] Rober1236 Jua: or maybe Google does
[15:09] Malburns Writer: maybe voice recognition software could someday help animate avatars
[15:09] Rober1236 Jua: but AI has not come far in Semantics
[15:10] Rober1236 Jua: well I would point to the Sociologist Mauss and say we learn to use our body to
[15:10] Evo Szuyuan: i think a lot of communication is in subtle body language
[15:10] Rober1236 Jua: in IT we give people tools and they learn to use them
[15:10] Saijanai Kuhn: human readable languages are difficult to program in
[15:10] Rober1236 Jua: well I think probably the richer tool concept is better
[15:10] Evo Szuyuan: not just in language
[15:10] Rober1236 Jua: let people have more control over the av
[15:11] Saijanai Kuhn: the rules work just fine for reading, but when trying to write NEW stuff in a human-like language, it gets kinda silly (like AppleScript)
[15:11] Roger Fullstop: sure, that’s the idea
[15:11] Roger Fullstop: the cam is not all or nothing
[15:11] Evo Szuyuan: exactly
[15:11] Roger Fullstop: it’s an extra input device
[15:11] Evo Szuyuan: it does not exclude anything else
[15:11] Roger Fullstop: why should we be limited to keyboard and mice?
[15:12] Evo Szuyuan: to get RSI ?
[15:12] Rober1236 Jua: well for accessibility we should make almost everything ultimately keyboard and mice
[15:12] Rober1236 Jua: actually keyboard
[15:12] Rober1236 Jua: but we can add other devices that make it richer
[15:12] Evo Szuyuan: like 3d cam ?
[15:12] Evo Szuyuan: 😀
[15:12] Roger Fullstop: some people can’t use keyboards…
[15:13] Rober1236 Jua: 4-D cam, with ability to go back sounds great
[15:13] Evo Szuyuan:
[15:13] Evo Szuyuan: these were the 2006 notes on puppeteering..
[15:13] Evo Szuyuan: for those interested
[15:13] Rober1236 Jua: excellent
[15:13] Roger Fullstop checks that
[15:14] Evo Szuyuan: if i could puppeteer other avatars i would put bingo straight 😉
[15:14] Evo Szuyuan: he’s been hanging there forever!
[15:14] Saijanai Kuhn: BTW, you probably missed it in the spam, but someone developed a text-to-American Sign Language web app
[15:15] Saijanai Kuhn:
[15:15] Bingo Onomatopoeia: ehheh
[15:15] Bingo Onomatopoeia: funny, didn’t notice that 🙂
[15:15] Saijanai Kuhn: doesn’t work on my Mac, but the interface looks cool
[15:15] Evo Szuyuan: interesting
[15:16] Saijanai Kuhn: seems like you could have primitive AI looking for key words or phrases and do custom gestures instead of the ASL
[15:16] Roger Fullstop: ok guys, I think RL (in the shape of my 8 year old daughter) is calling me
[15:16] Evo Szuyuan: ok..


Entry filed under: Dorkbot, media art, second life.
