Dorkbot Session Announcement

July 3, 2008 at 2:38 pm

The upcoming Dorkbot session will take place on 6 July at 1:00 pm PDT / 22:00 CET at the Odyssey Simulator!
http://slurl.com/secondlife/Odyssey/85/153/45/

This dorkbot session will be of interest to developers, performers, machinimators, animators, photographers, and just about anybody with an interest in bringing more expression to avatars and finding more direct ways to interface with our embodied selves.

JJ Ventrella (a.k.a. Ventrella Linden) will present the avatar puppeteering project, an effort motivated by the desire to bring more expression to the avatar by enabling a more fluid and direct way to manipulate the ‘physical avatar’. This is a Linden Lab project that was unfortunately ‘put to sleep’ when Linden Lab decided to focus on the stability of Second Life and internal opinions differed over how compelling the feature really is. Last month, however, Linden Lab released the client source code of the project to allow other people to build on it (and convince them otherwise).
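
For readers curious what ‘direct manipulation of the physical avatar’ looks like in code, here is a minimal, hypothetical sketch (not taken from the released Linden Lab source): instead of playing back a canned animation, a selected joint is eased a little closer to the user’s drag target on every frame.

```cpp
// Hypothetical sketch (not the released Linden Lab code): the core idea of
// avatar puppeteering is to pull a selected joint toward a user-supplied
// target every frame, instead of playing back a canned animation.
#include <cstdio>

struct Vec3 { float x, y, z; };

// Ease the joint's current position toward the dragged target.
// 'stiffness' in [0,1] controls how quickly the joint follows the drag.
Vec3 easeToward(const Vec3& current, const Vec3& target, float stiffness) {
    return { current.x + (target.x - current.x) * stiffness,
             current.y + (target.y - current.y) * stiffness,
             current.z + (target.z - current.z) * stiffness };
}

int main() {
    Vec3 wrist  = {0.0f, 0.0f, 0.0f};   // joint position in avatar space
    Vec3 target = {0.3f, 0.0f, 0.5f};   // where the user dragged the wrist
    for (int frame = 0; frame < 10; ++frame) {
        wrist = easeToward(wrist, target, 0.25f);
        std::printf("frame %d: wrist at (%.3f, %.3f, %.3f)\n",
                    frame, wrist.x, wrist.y, wrist.z);
    }
    return 0;
}
```
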
Philippe Bossut will present Handsfree3D, a project that lets users interface with virtual worlds via 3D camera motion tracking. He has developed several demos showing how one can navigate and carry out all sorts of tasks without a mouse and keyboard, and over the last month he has been working on connecting his code to the puppeteering feature.
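
As a rough illustration of the hands-free idea (the names and thresholds below are made up, not Handsfree3D code), a tracked hand position from a 3D camera can be mapped to walk and turn commands, with a dead zone so that small jitters around the neutral pose do not move the avatar.

```cpp
// Hypothetical sketch of hands-free navigation: map a tracked hand position
// from a 3D camera to walk/turn commands. Names and thresholds are
// illustrative, not taken from Handsfree3D.
#include <cstdio>

struct HandSample { float x; float z; };  // metres, relative to a calibrated neutral pose

enum class MoveCommand { None, Forward, Backward, TurnLeft, TurnRight };

MoveCommand commandFromHand(const HandSample& h, float deadZone = 0.05f) {
    if (h.z >  deadZone) return MoveCommand::Forward;   // push the hand toward the camera
    if (h.z < -deadZone) return MoveCommand::Backward;  // pull the hand back
    if (h.x >  deadZone) return MoveCommand::TurnRight;
    if (h.x < -deadZone) return MoveCommand::TurnLeft;
    return MoveCommand::None;                           // inside the dead zone: stand still
}

int main() {
    HandSample samples[] = {{0.0f, 0.01f}, {0.0f, 0.12f}, {-0.09f, 0.0f}};
    for (const auto& s : samples)
        std::printf("x=%.2f z=%.2f -> command %d\n", s.x, s.z,
                    static_cast<int>(commandFromHand(s)));
    return 0;
}
```
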

IM Evo Szuyuan for questions (or technical issues you might experience during the event).


JJ Ventrella

Bio
JJ Ventrella is a programmer-artist who specializes in Virtual Worlds and Artificial Life. After earning three college degrees (Virginia Commonwealth University, Syracuse University, and the MIT Media Lab), he moved to San Francisco to do simulation-based game design at Rocket Science Games. He then joined Will Harvey in founding There.com, where he was Principal Inventor for five years and invented many of the avatar communication technologies; he also created the There Dog, vehicle physics, and many other technologies and designs. After a short stint at Adobe developing JavaScript for Acrobat 3D, Ventrella spent two years at Linden Lab, where he invented FollowCam, Flexies, and Puppeteering. In his own time, Ventrella has been rolling out versions of his artificial life game GenePool and publishing papers about interactive animation and artificial life. Currently Ventrella is putting the final touches on the NASA Images home page at the Internet Archive. Soon he will head up to Vancouver to teach a Virtual Worlds class in the Masters of Digital Media program and to start writing his first book. You can see many of Ventrella’s creations at www.ventrella.com.


Philippe Bossut

Bio
Philippe Bossut (Roger Fullstop in Second Life) is a software engineer who has spent his entire career in computer graphics and desktop application development. He currently works as an “Entrepreneur in Residence” at Kapor Enterprises Inc (KEI), where he leads the Segalen project. Prior to Segalen, he worked for three years as an engineering manager at OSAF (Open Source Application Foundation) and, before that, held various management positions at Macromedia and Microsoft. Back in the ’90s, he was one of the early developers of Live Picture, an award-winning image compositing application for the Mac that helped popularize resolution-independent, non-destructive image editing. That entrepreneurial adventure took him away from his native France and brought him to California 14 years ago (though, considering his heavily accented English, you could swear he just landed…). He holds a PhD in Computer Graphics from Ecole des Mines de Paris and an engineering degree in Geology from Ecole des Mines de Saint Etienne. He also has a long-standing passion for Archaeology and has published some papers on the subject. His personal blog can be read at: http://pbossut.wordpress.com/

Segalen / HandsFree3D
Segalen is an attempt to bring a dramatic user experience improvement to Virtual Worlds in general and Second Life in particular. The initial momentum came from the emergence of “3D” cameras from a handful of vendors, based on a variety of technologies. Though none of those cameras were yet available to the public, Mitch Kapor decided to fund a small experimental project with Philippe Bossut to explore their capabilities. The project started in January 2008 with some camera prototypes and SDKs. In May, Segalen posted its first video demo, rapidly followed by a second one. The first showed that the camera and feature extraction were fast enough to completely replace the keyboard for in-world navigation. The second focused on precision tracking, showing that precise click-and-drag, as required in object editing for instance, was achievable. Those videos received very wide coverage in the blogosphere and the traditional press. After this initial success, the project zeroed in on the true objective: combining digital puppeteering with real-time motion capture. Most of the development since May has gone into debugging and re-architecting the “puppeteering branch” of the open source slviewer. This work is now complete (see JIRA VWR 7703) and the project is now refocused on plugging the camera feature tracking input into the viewer.
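
To make that last step concrete, here is a hypothetical sketch of what “plugging the camera feature tracking input into the viewer” could look like: a hand position reported in camera space is converted into avatar-local space and handed to the puppeteering layer as a joint target. The function and type names are illustrative and are not the actual slviewer API.

```cpp
// Hypothetical sketch of feeding camera tracking into puppeteering:
// convert a tracked hand position from camera space to avatar-local space
// and pass it on as a joint target. Names are illustrative, not slviewer API.
#include <cstdio>

struct Vec3 { float x, y, z; };

// Camera space: x right, y up, z toward the camera (metres).
// Avatar space (illustrative convention): x forward, y left, z up.
Vec3 cameraToAvatarSpace(const Vec3& cam, float reachScale = 1.5f) {
    return { cam.z * reachScale,   // pushing toward the camera moves the hand forward
            -cam.x * reachScale,
             cam.y * reachScale };
}

// Stand-in for whatever call would feed a target into the puppeteering branch.
void setPuppeteeringTarget(const char* joint, const Vec3& target) {
    std::printf("%s target: (%.2f, %.2f, %.2f)\n",
                joint, target.x, target.y, target.z);
}

int main() {
    Vec3 trackedRightHand = {0.10f, 0.25f, 0.40f};  // as reported by the camera SDK
    setPuppeteeringTarget("right wrist", cameraToAvatarSpace(trackedRightHand));
    return 0;
}
```
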

Philippe Bossut (Roger Fullstop), San Francisco (Second Life)
2008


Entry filed under: Dorkbot, media art, rhizomatic, second life.


4 Comments

  • 1. Saijanai Kuhn  |  July 4, 2008 at 6:25 am

    Definitely try to be there. And… I’d like to point out that a “pointing device” is probably not the best way to do real-time animation control. 3D mocap using a 3D camera might be the sexiest, but there are always multiple keypresses (especially on a Mac), and dare I mention MIDI keyboards?

    I could see an infinite number of non-mocap interfaces possible for this technology, such as the animation equivalent of stroke-based font creation, where fundamental animation movements would be mapped to individual keys on a computer or MIDI keyboard and evoked by sequences of simple or chorded keypresses.

    http://www.macintouch.com/gaiji.html

    No doubt other input technologies could be devised as well. Perhaps, rather than trying to figure out a one-size-fits-all strategy, or a handful, the best thing to do would be to devise a plug-in architecture for the overall system.


  • […] post info By charles1109 Categories: Uncategorized Dorkbot Session Announcement […]

  • […] project, and his work on connecting this 3D camera tracking to the puppeteering feature. Read the Meeting Announcement for more information and bio’s of the presenters Over 70 people showed up, so obviously there […]

  • 4. Recent Links Tagged With "dorkbot" - JabberTags  |  October 12, 2008 at 6:19 am

    […] links >> dorkbot Dorkbot 16 – Aug 21 at Cafe Mundi Saved by eXtraordinarycat on Sat 11-10-2008 Dorkbot Session Announcement Saved by lauralf on Fri 10-10-2008 Dorkbot Austin: Call For Presenters Saved by stephenayer on […]
