Monday, February 28, 2011

MotionBuilder

How to connect a Kinect to MotionBuilder.
( http://www.youtube.com/watch?v=_PtxfRpCBHE )


1.
In the 'Asset Browser', open 'Templates' > 'Characters', choose 'Actor',
and drag it into the 'Viewer'.
Then go to 'Devices' in the same folder,
select 'Brekel Kinect Device', and drag it onto the 'Navigator' tab of the 'Navigator' panel.
You should now see the 'Devices' icon with 'Brekel Kinect Device' under it.


2.
Double-click the 'Brekel Kinect Device' and open the 'Data' tab.
Initially the 'Online' indicator is red; it needs to turn yellow.


3.
Run Brekel Kinect and strike the initial (calibration) pose.
You should then see the data in the spreadsheet change as you move.


4.
Before recording, we have to create a skeleton.
Go to 'Model binding' and select 'Create...';
this adds a 'Kinect:Reference' entry.
If you check the 'Scene' icon on the left,
you will find the 'Kinect:Reference' there as well.
The point of recording is to capture one stable pose,
so the virtual actor can be aligned to the skeleton later,
instead of standing in front of the Kinect for ten minutes.
Make sure the plug-in is 'Online' (indicated by the green square) and that 'Live' and 'Recording' are checked.
To start recording, click the record and play buttons in the 'Transport Controls'.
After recording, turn 'Online' off.

Note: the user should stand far enough from the Kinect
that the whole body is in the camera view.
Rather than aligning to the live body, it is better to use the recorded skeleton, as follows.



5.
Drag an 'Actor' into the Viewer; an 'Actor' then appears in the 'Navigator' pane.
Next, generate a 'MarkerSet' as follows.




6.
Go back to the 'Scene' and assign each Kinect body part to the corresponding joint of the virtual actor.
Check 'Oriented' for each one.


7.
"If the directory that contains your character is not added yet, right click on the left side of the browser. From the menu that pops up select "Add favorite path" (figure 1). Add the folder that contains the character. To use the characters that come with Live Characters add the "LiveCharacters" folder that can be found at "C:\Program Files\WorldViz\" if you installed Live Characters to the default location. Click on the folder that got added in the "Asset Browser" and select the character file that is already characterized. "
(From Retarget Motion to a Character)
Load a live character by dragging it to the viewer.

Make sure the "Character Settings" tab is selected. As "Input Type:" select "Actor Input" (figure 2, red arrow) and click "Active" to activate your character.


8.
Finally, to animate the actor based on the user's motion, 'Active' should be checked.  :)




Note:
From 'C:\Program Files (x86)\WorldViz\LiveCharacters\help\LiveCharactersHelp.chm'
Record a T-Stance Pose
Map Optical Data to an Actor
Retarget Motion to a Character




 Issues:
+ MotionBuilder Error (VCOMP90.dll is missing)
http://www.the-area.com/forum/autodesk-motionbuilder/autodesk-motionbuilder-2009/mb2010---vcomp90dll-missing-after-install40with-no-errors-or-warnings41/

multi-touch in Windows

http://channel9.msdn.com/blogs/yochay/windows-7-mutli-touch-overview

Friday, February 25, 2011

Glut / GLSL Tutorials

Glut
http://www.lighthouse3d.com/opengl/glut/index.php?subwin
(http://www.xmission.com/~nate/glut.html)


GLSL
http://www.lighthouse3d.com/opengl/glsl/

Creating multiple/sub-windows using glut

http://www.lighthouse3d.com/opengl/glut/index.php?subwinrender


Creating multiple (top-level) windows is possible:
http://lists.apple.com/archives/mac-opengl/2003/Apr/msg00078.html

But why not with a real-time video feed?
I was able to create multiple windows with the above approach;
however, the input video was not updated.  :(
Any ideas?
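
A likely culprit: each GLUT window has its own OpenGL context, so a texture updated in one window is not automatically visible in the others, and glutPostRedisplay() only marks the current window for redraw. Here is a minimal sketch (untested, with the capture call left as a hypothetical placeholder) that posts a redisplay to every window from the global idle callback:

/* Sketch: two top-level GLUT windows showing the same live video feed.
   grabFrame() is a hypothetical placeholder for your capture API. */
#include <GL/glut.h>

int win1, win2;

void display()
{
    glClear(GL_COLOR_BUFFER_BIT);
    /* upload the latest frame into this window's context and draw it,
       e.g. glTexSubImage2D(...) + a textured quad */
    glutSwapBuffers();
}

void idle()
{
    /* grabFrame();  -- fetch a new frame from the capture source */
    /* glutPostRedisplay() only affects the current window, so post to both: */
    glutPostWindowRedisplay(win1);
    glutPostWindowRedisplay(win2);
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    win1 = glutCreateWindow("view 1");
    glutDisplayFunc(display);            /* callbacks are per-window */
    win2 = glutCreateWindow("view 2");
    glutDisplayFunc(display);            /* so register again for the 2nd window */
    glutIdleFunc(idle);                  /* the idle callback is global */
    glutMainLoop();
    return 0;
}

Note that the frame upload still has to happen per window inside its display callback, since the textures live in separate contexts.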

Issues with FBO (Frame buffer object)
http://www.openframeworks.cc/forum/viewtopic.php?f=8&t=1325

One possible solution is to use OF (Open Frameworks).
http://www.quietless.com/kitchen/getting-started-with-openframeworks/

Thursday, February 24, 2011

send out keyboard message win32

I found this article, which is very useful:
http://www.codeproject.com/KB/cpp/sendkeys_cpp_Article.aspx
You can download the code as well.


This was also useful.

// Simulates a key press followed by a key release using SendInput.
#include <windows.h>

void GenerateKey(int vk, BOOL bExtended)
{
    KEYBDINPUT kb = {0};
    INPUT Input = {0};

    // generate key-down event
    if (bExtended)
        kb.dwFlags = KEYEVENTF_EXTENDEDKEY;
    kb.wVk = vk;
    Input.type = INPUT_KEYBOARD;
    Input.ki = kb;
    ::SendInput(1, &Input, sizeof(Input));

    // generate key-up event
    ::ZeroMemory(&kb, sizeof(KEYBDINPUT));
    ::ZeroMemory(&Input, sizeof(INPUT));
    kb.dwFlags = KEYEVENTF_KEYUP;
    if (bExtended)
        kb.dwFlags |= KEYEVENTF_EXTENDEDKEY;
    kb.wVk = vk;
    Input.type = INPUT_KEYBOARD;
    Input.ki = kb;
    ::SendInput(1, &Input, sizeof(Input));
}
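
For example, sending an arrow-key press (an "extended" key) to whatever window currently has focus:

GenerateKey(VK_RIGHT, TRUE);   // extended keys: arrows, Home/End, Insert/Delete, ...
GenerateKey('A', FALSE);       // letters/digits use their ASCII codes as virtual-key codes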

Wednesday, February 23, 2011

VRPN

This page is a great starting point to learn about VRPN.
http://www.vrgeeks.org/vrpn/tutorial---use-vrpn#TOC-Adding-Buttons-and-Trackers


Of course, this page is the official VRPN page.
http://www.cs.unc.edu/Research/vrpn/


Open vrpn.sln and compile vrpn_server (for the server side) and vrpn_print_devices (for the client side).
Before running, copy vrpn.cfg to the vrpn_server directory and edit it (uncomment the Mouse0 part; see the snippet below).
Then run the server first, and then the client. Move your mouse and you will see output.
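
For reference, the line to uncomment in vrpn.cfg looks roughly like this (the exact wording of the surrounding comments varies between VRPN versions):

# vrpn.cfg excerpt -- remove the leading '#' to enable the mouse device
vrpn_Mouse Mouse0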

If you see the following messages, you have not uncommented any devices yet.
VRPN Error
 (10) from Tracker0: No response from server for >= 10 seconds
VRPN Error
 (10) from Mouse0: No response from server for >= 10 seconds


Note: the config file is at
.\VRPN\vrpn_07_28\server_src\vrpn.cfg

client_and_server.C is a good starting point for understanding how to implement a server/client,
along with this page: http://www.cs.unc.edu/Research/vrpn/Servers.html

Create a new project, add the file client_and_server.cpp, and edit it a bit:
#include "./../../vrpn_Tracker.h"
and link against vrpn.lib (which was at C:\Program Files (x86)\vrpn_07_28\pc_win32\Debug).
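
As a smaller client-side sanity check, a minimal tracker client is only a few lines. This is a rough sketch against the VRPN C++ API (the device name Tracker0@localhost and the sleep interval are assumptions), not a replacement for client_and_server.C; link it against vrpn.lib as above.

#include <stdio.h>
#include "vrpn_Tracker.h"   /* adjust the path, e.g. "./../../vrpn_Tracker.h" */

/* Called whenever the server reports a new tracker pose. */
void VRPN_CALLBACK handle_tracker(void*, const vrpn_TRACKERCB t)
{
    printf("sensor %ld: pos = (%f, %f, %f)\n",
           (long)t.sensor, t.pos[0], t.pos[1], t.pos[2]);
}

int main()
{
    vrpn_Tracker_Remote tracker("Tracker0@localhost");
    tracker.register_change_handler(NULL, handle_tracker);
    while (true) {
        tracker.mainloop();     /* pump the connection; callbacks fire here */
        vrpn_SleepMsecs(1);     /* avoid spinning at 100% CPU */
    }
    return 0;
}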


VRPN example:
http://openvibe.inria.fr/documentation/unstable/Doc_VRApplicationAndVRPN.html

Issues:
glutMainLoop & VRPN loop
http://lists.unc.edu/read/messages?id=5271031
testimager_client.C (testimager_client)
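
The usual workaround when glutMainLoop() owns the main loop is to pump the VRPN objects from GLUT's idle callback instead of writing your own while loop. A sketch, assuming a global vrpn_Tracker_Remote* named tracker:

/* Pump VRPN from inside GLUT's main loop (glutMainLoop never returns). */
void idle()
{
    tracker->mainloop();     /* 'tracker' is a global vrpn_Tracker_Remote* */
    glutPostRedisplay();     /* redraw with the latest tracker data */
}

/* in main(), after creating the window and the tracker: */
/*     glutIdleFunc(idle);   */
/*     glutMainLoop();       */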

The directshow_video_server example is also useful.

Thursday, February 17, 2011

How to get BOOST working on XP and Visual C++ 2005

http://bytes.com/topic/python/answers/583008-how-get-boost-working-xp-visual-c-2005-a
 
Osiris (Jan 2 '07):
My experiences with BOOST on Windows XP and Visual C++ 2005

I'm new to Python.
I built software in more than ten other computer languages. I'm not
sure if that is not a handicap, when reading documentation of a system
like BOOST.
However:
It took me more than four full working days to get a Python/C/C++
'hello world!' program to work. There is a lot of documentation, but
somehow it did not 'click' with my previous experience. I think the
doc was written by very, very capable C++ and Python programmers, who
forgot how the world was before they got involved with Python.
A teacher/doc writer should never forget that, I think.
Mind you: the documentation is all there. Stressing the word ALL.
There is a lot of documentation. Which makes it very difficult to
choose the right piece.

My project is, to use old and tested and stable and error-free C-code
in a website built with Zope (www.zope.org) and Plone (www.plone.org).
So I think, C-code can be wrapped in some C++ and then, with the help
of Boost, be called from Python. Python is important when using Plone.

So let me summarize what I found out.

BOOST is a system to combine Python and C++. Call C++ compiled code
from Python, which is interpreted.
In the end, BOOST is a sort of "make" facility like found on Linux
(and Windows). In Boost it is called, not 'make.exe', but 'bjam.exe'.
You define a list of operations, which bjam executes for you. It runs
from the command line (I grew up, IT-wise, in the DOS-era. Hurray), it
has no GUI-like Windows front-end.

So the first step is to get bjam.exe from the net. There are two ways
to get it:
1. download a ready-to-run bjam.exe from
http://downloads.sourceforge.net/boo...7&big_mirror=0.
In the zip you will find the bjam.exe, which you put it in a place
where the system can always find it, like in c:\, or somewhere else in
the system's PATH.
2. download the sources for bjam.exe and build it yourself:
http://downloads.sourceforge.net/boo...5&big_mirror=0
I recommend not doing this, if you don't have to. And on Windows XP,
you don't have to. You could spend a lot of time to figure out how to
set up things before even building bjam.exe.

The second step is to get Boost libraries. These are for combining
with your C/C++ source, so Python can access the C/C++ code.
You download this stuff from Sourceforge:
http://downloads.sourceforge.net/boo...1&big_mirror=0
It is a zip file that you unzip to a convenient location. I unzipped
to D:\ so I got a folder named d:\boost_1_31_1 with all the stuff in
it. I renamed this folder to d:\boost, to get rid of all the messy
version numbers.
To build the boost libraries from these sources, you need bjam, and
bjam makes use of your C/C++ compiler. In my case that was Microsoft
Visual C++ 2005, which has version number 8.0.
Now you have to make bjam and Visual C++ acquainted with the location
of supporting software. I made a BAT file to do this. This is what is
in that .BAT file, named SET.BAT and which I put in D:\boost:

d:
cd \boost
call e:\msvc\vc\vcvarsall.bat
set VC80_ROOT=e:\msvc\vc
set TOOLS=vc-8_0
set PYTHON_ROOT=c:\python24
set PYTHON_VERSION=2.4

I explain:
e:\msvc is where I installed my Visual C++ system. The Microsoft
default location would be something like C:\Microsoft Visual C 2005\
etc, but I preferred D:\msvc.
Change the SET.BAT file as needed .
My IDLE (http://www.python.org/idle/) Python 2.4 is in C:\python24
The value 'vc-8_0' denotes the boost identification of my MS Visual
C++ system. If you use an other C++ system, it must be something else
(see http://www.boost.org/more/getting_started.html)

Now start a DOS box: Click the Start button in the lower left corner,
click on "run" and type "cmd".
There you switch to D:\ and change directory to \BOOST.
Execute the SET.BAT.
Then just type "bjam" (the system will find the program bjam itself,
because it is in the PATH)

Now get a lot of coffee, because the build will take a LONG time,
maybe 15 minutes or more.
You will see a lot of cpp-compiling and linking going on and some
warnings about 'deprecation', which are irrelevant.

When finished, keep the DOS box open. You will find BOOST stuff in
C:\boost, the default location for the compiling results.

Now try out an example. In the DOS box, go to
D:\boost\libs\python\example\tutorial, where you will find a 'hello'
example and a Jamfile. Jamfiles are to bjam what makefiles are to make: a
script for bjam to build all the 'hello' stuff needed for Python.

Type 'bjam' again, and take a smaller amount of coffee. The system
will build the 'hello' stuff for you. Do not be alarmed about bjam
finding 1200+ 'targets' and rebuilding 40 of them, when only needing
to compile hello.cpp…. this is normal behaviour.
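
For reference, the tutorial's hello.cpp is essentially the following (a from-memory sketch of the Boost.Python tutorial example; the exact function and module names may differ slightly between Boost versions):

#include <boost/python.hpp>

char const* greet()
{
    return "hello, world";
}

/* The module name must match the .pyd file name (hello.pyd). */
BOOST_PYTHON_MODULE(hello)
{
    using namespace boost::python;
    def("greet", greet);
}

Once hello.pyd and boost_python.dll sit next to python.exe, 'import hello' followed by 'hello.greet()' should return the greeting at the Python prompt.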

When bjam has finished, you will find 'hello' stuff in the
unbelievably deep folder
D:\boost\libs\python\example\tutorial\bin\tutorial\hello.pyd\vc-8_0\debug\threading-multi
Really. I don't know why this must be so deep.
And some Boost stuff in the even deeper folder
D:\boost\libs\python\example\tutorial\bin\boost\libs\python\build\boost_python.dll\vc-8_0\debug\threading-multi

Find hello.pyd and boost_python.dll and move them to the folder
where your python.exe is, in my case c:\python24.

Now you go to
file:///D:/boost/libs/python/doc/tutorial/doc/html/python/hello.html
which is part of the Boost download, and read the page.
That should get you on your way.
Of course, now it might be beneficial to you to start reading the
other documentation on
http://www.boost.org/more/getting_started.html

Tuesday, February 15, 2011

Connect Kinect to PC Using PrimeSense Drivers

I got this info from this page.
http://groups.google.com/group/openkinect/browse_thread/thread/3330d5ebb79995ae?tvc=2


Hi

I was confused by the different instructions for connecting the Kinect to a PC.
I actually couldn't install the OpenKinect drivers, but I came up with
instructions "4 Dummies"! :D
I would be glad if somebody makes something like this for OpenKinect.

Sajjad


Step 1

Uninstall any previous drivers, such as CLNUI, OpenKinect, ...
Uninstalling a driver does not seem to be easy, especially in the case of OpenKinect.


Step 2

Download and install the latest stable or unstable OpenNI binaries from the
OpenNI website.
http://www.openni.org/downloadfiles/2-openni-binaries
There might be a security warning in this step and the next one. Ignore it
and continue installing.


Step 3

Download the Kinect driver from the following link. The file name will be
avin2-SensorKinect-0124bd2.zip.
https://github.com/avin2/SensorKinect
Unzip the file and run
 avin2-SensorKinect-0124bd2\Bin\SensorKinect-Win32-5.0.0.exe


Step 4
Restart


Step 5
Plug in the Kinect. Wait until Windows finds and installs the drivers. Check that
the camera and motor drivers are installed via Control Panel -> System
and ... -> System -> Device Manager.
Currently there is no audio driver available from PrimeSense, although there is one
provided by OpenKinect.


Step 6

Check the OpenNI samples. NiSimpleRead just prints the depth data for the center of
the view to a console window. NiViewer gives you a color-coded depth view of
what the Kinect is looking at.
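
For orientation, NiSimpleRead boils down to roughly the following (a from-memory sketch against the OpenNI 1.x C++ wrapper, with error handling trimmed; the SamplesConfig.xml path is assumed to be the one shipped with the samples):

#include <stdio.h>
#include <XnCppWrapper.h>

int main()
{
    xn::Context context;
    if (context.InitFromXmlFile("SamplesConfig.xml") != XN_STATUS_OK)
        return 1;

    xn::DepthGenerator depth;
    if (context.FindExistingNode(XN_NODE_TYPE_DEPTH, depth) != XN_STATUS_OK)
        return 1;

    xn::DepthMetaData md;
    for (int i = 0; i < 100; ++i) {             // read 100 frames
        context.WaitOneUpdateAll(depth);        // block until a new depth frame arrives
        depth.GetMetaData(md);
        // depth value (in mm) at the center of the image
        printf("center depth: %u mm\n", (unsigned)md(md.XRes() / 2, md.YRes() / 2));
    }
    context.Shutdown();
    return 0;
}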


Step 7
Download and install the latest stable or unstable OpenNI Compliant
Middleware Binaries (NITE) from OpenNI website.

During installation, provide the following (free) PrimeSense key:
0KOIk2JeIBYClPWVnMoRKn5cdY4=


Step 8

Restart.


Step 9

Replace xml files in C:\Program Files (x86)\Prime Sense\NITE\Data and
C:\Program Files (x86)\OpenNI\Data with the ones in
avin2-SensorKinect-0124bd2\NITE\Data and
avin2-SensorKinect-0124bd2\OpenNI\Data


Step 10

Check out the NITE samples and have fun!


Note:
Error: the procedure entry point xnproductionNodeRelease could not be located in the dynamic link library openNI.dll
Sol: It is a version issue after all. Using the unstable release of OpenNI/NITE and then re-installing the PrimeSense Kinect mod solved the problem.


==========================================



Nathan wanted me to share with you how we got the Kinect hooked up with MotionBuilder. It's straightforward for the most part:

Go to http://www.brekel.com to get the Application and plugin for MotionBuilder

And here are the specific instructions/drivers that Brekel needs for it to work with MotionBuilder:

~~~~~~~~~~~~~~~~~~~~~~
1) Make sure you have a separate power adapter for your Kinect, you'll need it.
    If you bought a standalone Kinect it came with one in the box.
    If you bought a Kinect with a XBox360 bundle you may have to order one from Microsoft.

2) If you already have Kinect drivers installed that are NOT from OpenNI, make sure you delete them from your system first!
    For example the OpenKinect/libfreenect ones or the Code Laboratories CL NUI ones.
    These usually show up under a "libusb" folder in your Windows Device Manager, where you can also delete them.
 
3) Download and install OpenNI
    http://www.openni.org/downloadfiles/openni-binaries/21-stable
    Note that you want the stable version (v1.0.0.23), not the unstable one (v1.0.0.25)

4) Download and install the Kinect drivers
    https://github.com/avin2/SensorKinect/tree/master
    Run SensorKinect-Win32-5.0.0.exe file from the Bin folder
    Note that you don't pull the one from the unstable branch, but the one from the master branch
    (If you see an OSX and Linux installer in the bin folder you've got the wrong one!)
 
5) Download and install NITE (user tracking module)
    http://www.openni.org/downloadfiles/openni-compliant-middleware-binaries/34-stable
    Use this key during installation:    0KOIk2JeIBYClPWVnMoRKn5cdY4=

6) Make sure your Kinect is connected directly to your computer not through a USB hub
    Your Windows Device Manager should look something like this:
  
7) Check if everything works by running one of the OpenNI samples:
    C:\Program Files (x86)\OpenNI\Samples\Bin\Release\NiSimpleViewer.exe
    This should display a fullscreen videostream of the depth feed from your Kinect
  
8) If the sample works then you're ready to start Brekel Kinect 3D Scanner from your windows start menu.
    If you have problems the OpenNI google groups are a great source for help and news:
    http://groups.google.com/group/openni-dev

Note: If your firewall asks for permission to open a port please hit accept.
This is needed for streaming the data across your local network, or between applications.
~~~~~~~~~~~~~~~~~~~~~~

I got as far as getting the Kinect to work with Brekel and getting MotionBuilder to recognize the Kinect. But I don't know how to use MotionBuilder, so... I'm currently stuck there.

-Albert =D

--
Albert C. Lai | albert@immersivetech.org | Meddling Genius
310.384.9838 | 3550 Wilshire Blvd. #1520, Los Angeles, CA 90010
Neuroscience | Biomedical Engineering | Computer Science | Public Administration | International Relations


==========================================

Issues:
+ "InitFromXml failed: Can't create any node of the requested type! " 
http://groups.google.com/group/openni-dev/msg/51505a9aa9c5ab9d
"Kinect Camera" was not detected correct in the "device manager".

+ InitFromXml failed: Device Protocol: Bad Parameter sent!
1) http://groups.google.com/group/openni-dev/browse_thread/thread/c50238431e6dec4f
or
2) Replace xml files in C:\Program Files (x86)\Prime Sense\NITE\Data and
C:\Program Files (x86)\OpenNI\Data with the ones in
avin2-SensorKinect-0124bd2\NITE\Data and
avin2-SensorKinect-0124bd2\OpenNI\Data

+ Got an error while reading network buffer: Xiron OS failed to receive a network buffer!
 Replace xml files in C:\Program Files (x86)\Prime Sense\NITE\Data and
C:\Program Files (x86)\OpenNI\Data with the ones in
avin2-SensorKinect-0124bd2\NITE\Data and
avin2-SensorKinect-0124bd2\OpenNI\Data