A New Bio-informatics Framework: Research on 3D Sensor Data of Human Activities

Abstract—Due to the increasing attraction of motion capture technology and the use of captured data in a wide range of research applications, we have developed a framework as an improved version of the MoCap Toolbox on the MATLAB platform. First, we introduce a script that handles public motion capture data in a user-friendly way. Several functions were edited, using dynamic programming and the Body Segment Parameters (BSP), to configure the marker positions according to the data. The framework can visualize and refine data without the MLS viewer or the C3D Editor software. It opens a valuable path for sensor data in many research areas, such as gait movements, marker analysis, compression and motion patterns, bioinformatics, and animation. Experiments performed on the CMU and ACCAD public mocap databases achieved a more accurate configuration scheme of the 3D markers than the prior art, especially for C3D files. Another distinction of this work is that it handles the distortion caused by extra markers and provides a meaningful way to use captured data.


I. INTRODUCTION
Owing to improvements in motion capture technology, optical, mechanical, or magnetic sensors can be attached to human joints and their movements recorded. Such systems depend on an active source that emits high-frequency pulses of infrared light, which are reflected by small spherical markers or LEDs attached to the tracked subject (e.g., a subject walking or running). In an optical motion capture system, each camera captures the positions of the reflective markers in two dimensions, and the camera network then computes position data in three dimensions. More and more researchers are now interested in using 3D mocap data in different types of applications, such as motion retargeting [1][2][3], analysis [4], animation [5], and surveillance systems [6]. They therefore demand toolboxes that can exploit data in the ASF/AMC, BVH [7], and C3D [8] file formats for multiple research purposes, and some of them have developed mocap toolboxes that are helpful to other scientists working in various research directions. Recently, Jeroen van Boxtel [9] developed the BioMotion Toolbox in the MATLAB environment, which can read and display different types of mocap data by using Psychtoolbox-3 [10]. This third-party toolbox (Psychtoolbox-3) is not specifically designed for biological motion, so some features of the BioMotion Toolbox are limited and tailored to displaying and manipulating point-light displays (PLD). Charles Vernon [11] developed a toolbox with a limited number of functions that provides a graphical user interface (GUI). The major mocap toolbox on the MathWorks platform mainly deals with data recorded by infrared-marker-based optical motion capture systems.
MathWorks provides precompiled functions [12] that can be used on different types of data, and some of them are used as part of mocap toolboxes, such as the PCA and ICA packages and the Signal Processing and Statistics Toolboxes. Users can therefore design functions and scripts in the MATLAB environment according to their requirements. Recently, Burger and Toiviainen [13] developed the MoCap Toolbox with 64 functions, excluding other toolbox packages. These functions are used to visualize and analyze captured data and can read different data formats. They defined three parts: 1) the Motion Data Structure (MDS), 2) the Segment Data Structure (SDS), and 3) the Norm Data Structure; these structures are interconnected and processed together. Their computation applies statistical and mathematical methods in order to propose a homogeneous framework for analysis and simulation (animation). They also claimed that their work can read C3D files, but some indispensable issues remain when dealing with such data: it cannot read the public mocap data. Inspired by [14], our focus is to refine these functions and apply them to C3D data, which can then be used for different purposes such as clinical studies, motion retargeting, and animation. In [13] there remain big issues and challenges for mocap data researchers: 1) a C3D file is read but not displayed in human skeleton shape (see Figure 1a); 2) the marker configuration that shows the motion data structure of the human skeleton; 3) normalization of the human skeleton data (see Figure 1b, c); and 4) the Qualisys software is required to manage the .dat and .mat data formats. We have tried to resolve each of these issues.
Output = mcread('E:\walk.c3d');   (1)

We use some of the existing functions after embedding the script (C3D_VaxD2PC) into the toolbox, and read the data successfully with (1). The following functions are then used to display the calibrated 3D marker positions and form them into a human skeleton shape. However, the result does not look like a human skeleton; it looks like a network of connections between marker nodes, because some existing toolbox functions do not operate according to the marker positions (see Figure -). Keeping the above issues in mind, we have refined some functions of the MoCap Toolbox to remove these problems and compiled them successfully according to the data [15], [16]. The result can be combined with other toolboxes in the MATLAB environment, such as Mocap136 [17] and Robotics [18]. The structure of the improved version of the toolbox is given in Figure 2. A lot of research has been done on mocap data; the improved version can be used in different research fields such as joint analysis, locomotion pattern design, motion retargeting, human skeleton animation, motion classification, 3D pose estimation, and human identification, using 3D mocap databases together with the human Body Segment Parameters (BSP).
The rest of this paper is organized as follows. Section II gives a concise account of motion capture systems and data preprocessing. Section III describes the sensor configuration scheme and the data visualization functions. Brief descriptions of the human Body Segment Parameters (BSP) and data normalization are given in Section IV. The experimental results, conclusion, and future work are presented in Section V.

A. Mocap Data Preprocessing
Preprocessing of data is a primary step in obtaining accurate results in any scientific research [19]. Data captured by ubiquitous sensors during subject movement is stored in the C3D file format with the necessary attributes, such as start and end times, sensor ID, and sensor values. The script in (2) converts the C3D files into PC format, because these files depend on several types of hardware and floating-point representations. After converting the data through (2), the MoCap Toolbox mcread() function reads the C3D files and arranges them into a reliable database, here called "Processed Data for MocapToolbox". After the C3D file has been read successfully, some remaining typical issues are shown in Figure 1 (c & d); they affect the standard human skeleton, the markers attached to the human body, and the joint positions.
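The hardware dependence mentioned above can be detected from the file itself: the C3D specification stores a processor-type byte in the parameter section that identifies the floating-point convention. The following sketch, written in Python for illustration only (the paper's own script is a MATLAB function), shows one way such a check could look; it is an assumption-laden outline, not the toolbox code.

```python
import io

def c3d_processor_type(stream):
    """Return 'intel', 'dec' (VAX-D) or 'mips' for a C3D byte stream.

    Per the C3D specification, the first byte of the file points at the
    parameter block (1-based, in 512-byte blocks), and the 4th byte of the
    parameter header stores 83 + processor code (1=Intel, 2=DEC, 3=MIPS).
    """
    header = stream.read(512)                # first 512-byte block
    param_block = header[0]                  # 1-based parameter block index
    stream.seek((param_block - 1) * 512)     # jump to the parameter section
    param_header = stream.read(4)
    code = param_header[3] - 83
    return {1: "intel", 2: "dec", 3: "mips"}.get(code, "unknown")
```

A file reporting 'dec' would need the VAX-D conversion before mcread() can interpret its coordinate values.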
Both of these issues will be addressed in Section III through the mapar.conn and japar.conn structures.

A. Motion Sensor Configuration
The markers, or motion sensors, can be configured on the human body in many possible ways for motion recording; one of them is shown in Figure 4.
In Figure 4, 41 sensors are configured on the human body, following the configuration scheme of the CMU motion capture system described in the Vicon 512 manual [20]. We analyzed the sensor labels of the C3D file by using the 3MAX software. Keeping to these labels, we assessed the sensor values after editing the mcreadc3d function according to the marker order of the template. The data retains some dumped markers recorded during the motion capture session of human activities such as walking, running, and dancing. We modified the mcreadc3d function through the following steps:

1) First, access the indices of the motion sensors placed on the human body by using the code (Appendix A.1.1). For example, access the L_finger index of a motion sensor from a C3D file of the CMU database (see Appendix A.1). The remaining 40 or 41 labels are accessed from the file in the same way.
2) Some extra markers are stored during recording; we handle them by assigning zero values and aligning them with the marker labels.
3) Many mocap data scientists use the C3D format. They find it hard to export and construct raw motion data into the desired model, because the positions of the calibrated motion sensors do not match and therefore cannot form a human model. To address this, we use mapar.conn() and configure it according to the public mocap data, using the mathematical mappings one-to-one and one-to-many (see Figure 5). The function parameters can be derived as shown in Appendix A.
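The three steps above can be sketched compactly. The fragment below is an illustrative Python outline (the toolbox itself is MATLAB): the label list and connection pairs are hypothetical stand-ins for the CMU template, not its actual contents.

```python
# Hypothetical excerpt of a marker-label template; the real CMU template
# has 41 labels in a fixed order.
TEMPLATE = ["root", "lowerback", "upperback", "thorax", "L_finger"]

def label_index(labels, name):
    """Step 1: 1-based index of a marker label, or 0 if it is absent."""
    return labels.index(name) + 1 if name in labels else 0

def pad_frame(frame, n_markers):
    """Step 2: zero-pad one frame of (x, y, z) triples out to n_markers,
    so extra/missing markers line up with the label template."""
    return frame + [(0.0, 0.0, 0.0)] * (n_markers - len(frame))

# Step 3: one-to-one connections between marker indices,
# in the spirit of the mapar.conn field (pairs are illustrative).
conn = [(1, 2), (2, 3), (3, 4)]
```

With the labels aligned and padded, each connection pair then indexes two valid marker positions, which is what makes the mesh in Figure 5 well-defined.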

A. Human Body Segment Parameters
Body segment parameters play an important role in generating the motion of human activities with motion capture systems. The mcgetsegmpar() function of the MoCap Toolbox uses the BSP data computed by Dempster [32]. We replaced the BSP parameters [32] in mcgetsegmpar() with those of Zatsiorsky and Seluyanov as adjusted by de Leva [33]; their computation is given in [34]. For instance, an example of the BSP computation for a specific human body part can be seen in Eqs. A.1 to A.6 in Appendix A.1.3. These parameters and the 3D motion capture data are used to visualize the standard human skeleton with 20 joints (see Figure 7).
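The segment-mass part of such a BSP table reduces to fixed fractions of total body mass. The sketch below, in Python for illustration, uses approximate de Leva-adjusted fractions for male subjects; the exact values and the full table (segment lengths, centers of mass, radii of gyration) are in [33], so this is a simplified outline rather than the mcgetsegmpar() implementation.

```python
# Approximate de Leva-adjusted segment masses as fractions of body mass
# (male subjects; limb segments are counted once per side).
SEGMENT_MASS_FRACTION = {
    "head":      0.0694,
    "trunk":     0.4346,
    "upper_arm": 0.0271,
    "forearm":   0.0162,
    "hand":      0.0061,
    "thigh":     0.1416,
    "shank":     0.0433,
    "foot":      0.0137,
}

def segment_mass(segment, body_mass_kg):
    """Mass of one body segment, in kg, from the subject's total body mass."""
    return SEGMENT_MASS_FRACTION[segment] * body_mass_kg
```

A useful sanity check on any such table is that head + trunk + twice the one-sided limb fractions should sum to 1.0 of body mass.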

B. Mocap Data Normalization and Visualization of the Refined Skeleton
In order to investigate the marker positions of a C3D file and their indexing, we configured the mapar.conn() and japar.conn() structures according to the indices. The source code for the marker index configuration is given in Appendix A.1.2.
The m2jpar() structure depends on the mcm2j() function, which computes the translation from a marker representation to a joint representation. Each joint position is computed by applying the center method to the markers placed around the corresponding body joint. For instance, the root joint position is found as the center of markers 22, 23, 2, and 20 (markers marked with blue ellipses in Figure 4). Other joint positions are computed similarly. We label each joint with a specific name; the names can be seen in Table 1. The following function is used to initialize the required joint parameters.
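The center method described above can be sketched as a plain mean over marker positions. This Python fragment is illustrative only (the toolbox computation lives in mcm2j()); the marker indices follow the paper's root-joint example, while the coordinates in the usage are invented.

```python
def joint_position(markers, indices):
    """Mean 3D position of the markers at the given 1-based indices.

    `markers` is a list of (x, y, z) tuples for one frame; the joint is
    taken as the centroid of the markers surrounding it.
    """
    pts = [markers[i - 1] for i in indices]
    n = len(pts)
    return tuple(sum(p[k] for p in pts) / n for k in range(3))
```

For the root joint, joint_position(markers, [22, 23, 2, 20]) averages the four pelvis-area markers into a single joint coordinate.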

japar = mcinitanimpar;
It contains information that helps initialize the joint parameters and assigns the attributes of the japar structure. One of its fields is edited by putting in the joint indices of m2jpar and making the connections between these indices with the mapping methods described above (see Figure 5); the connections are accessed through japar.conn (see the source code in Appendix A.1.3). Except for the two joints 11 and 1, the one-to-one definition is used. The following function visualizes the skeleton with 20 joints (see Figure 7):

mcplotframe(walk2j, 180, japar);

The following function produces the animated pose of the skeleton extracted from the C3D file of the CMU database; after editing some MoCap Toolbox functions, it can also create an animation of the mocap data. Figure 8 was created by using the mcanimate() function as

mcanimate(walk2j, 15, japar);   (4)

After this modification of the toolbox functions, the C3D files of the public 3D mocap databases can be read successfully. The performance of some of the existing functions is discussed in Section V.

V. RESULTS & DISCUSSION
In this paper, we improve the MoCap Toolbox by introducing a new script and editing some functions, so that public mocap data such as the CMU and ACCAD (Advanced Computing Center for the Arts and Design) databases can be used as input. Previously, the toolbox demonstrated 28 sensors, formed a triangular-mesh human model, and extracted a skeleton (see Figure 9 a & b) by applying the structured connection matrix; that data had been collected with a Qualisys motion capture system. When we then used public mocap data (C3D files), we found fundamental errors (see Figure 1), which show that data with more than 28 sensors was not supported.
Meanwhile, public data has at least 41 or 42 markers, which is the standard form; more than that, such as 80, 90, or 356 markers, creates ambiguity and makes the data difficult to understand in many research fields. This issue is handled by adding code to mcreadc3d(); the code can be seen in Appendix A.1.2. We modified a list of MoCap Toolbox functions according to the CMU and ACCAD mocap databases: mcreadc3d(), mapar.conn, japar.conn, mcm2j(), mcgetsegmpar(), and m2jpar(). The visualization of the markers placed on the human body, transferred into a skeleton by using the edited functions, is illustrated in Figure 10, which shows the effects of the edited functions. The rest of the functions can also be used on public mocap data. Some examples are as follows. As shown in Figure 11(a), the animation poses of the human mesh skeleton, with the interconnections between markers, were produced successfully, complying with all required parameters in function (8).
Figure 11(b) shows the successful result of animation poses of the normalized human skeleton, extracted from the 41 markers and compiled with all required parameters in function (9). The parameters (walk1 and mapar) deal with the markers placed on the human body, while (walk2j and japar) deal with the human joint positions. The following functions help assess the accuracy of the sensors placed on the human body during the motion recording sessions, giving information about missing markers, which is very useful for researchers (see Figure 12). The rest of the existing functions in the MoCap Toolbox (framework) can also be applied to mocap data, and some of them can plot with respect to time, such as the mcplottimeseries function. For example, Figure 13 shows the hip joint sensor with missing frames in the xyz coordinates (the missing area is marked with a circle); this information relates to female subject B20 in the ACCAD mocap database. We compared some of the above functions (see Table 2) with the early version of the toolbox with respect to response time on a single platform. We performed this evaluation on two public mocap databases and on the existing data of the toolbox; the details of these experiments are presented above. All of the above experiments were performed under the following specifications: Dell machine with 2 GB RAM, Intel(R) Core(TM) i5 M520 CPU @ 2.4 GHz, Windows 7 Ultimate 64-bit, and MATLAB R2012a. In Table 2, the fourth column contains red and blue ellipses: red denotes the edited functions and blue indicates the early functions' response times. Finally, we conclude that the red (edited) functions take more time than the blue ones because they handle public data, but they are open to everyone who wants to test mocap data for multiple applications. The conclusion is summarized in Figure 14 with the functions' response times.

VI. CONCLUSION AND FUTURE WORK
In this study, the improved version partly inherits the structural information of the MoCap Toolbox. The key assumption of this improved version is that there is a high probability of using public mocap data, which comes in the native C3D file format and carries rich information about the human body, supporting various types of numerical treatment. In the future, we hope it will allow a rich variety of functions for motion capture data in numerous research fields such as animation, human joint analysis, gender and human identification (bioinformatics), and real-time motion retargeting.
Our improved version proves effective in finding human mesh and stick skeleton models from public data, compared with capture under the previous development platform [34], which supports only 28 3D markers. These skeleton models will be useful for joint movement analysis and multiple purposes in indoor and outdoor environments. In addition, the framework gives information about the quality of the captured mocap data (see Figures 12 and 13).

Fig. 1. (a) Errors in reading the C3D file before introducing the script; after removing the errors, a considerable data problem remains, such as the frame issue. (b) The existing marker position settings in the function.

Fig. 2. Flowchart of the improved version of the MoCap Toolbox

II. MOCAP DATA SYSTEMS

Motion Capture Sensor Systems (MCSS) record the mocap data. Such systems are highly sophisticated and require a certain number of motion capture sessions during subject activities. They are classified into two categories: (1) optical and (2) non-optical capture sensor systems. They provide different data formats: *.C3D, *.BVH, *.txt, *.tsv, and *.ASF/AMC.
The C3D format depends on several types of hardware platform, such as DEC (Digital Equipment Corporation), SGI/MIPS (Microprocessor without Interlocked Pipeline Stages), and Intel. These platforms represent floating-point numbers differently, and values are stored accordingly (VAX-D, IEEE-LE, and IEEE-BE). We therefore wrote a script called "C3D_VaxD2PC". It establishes the connection between the MoCap Toolbox functions and the data, and it is used before the mcread() function, with the syntax

Output = C3D_VaxD2PC('Convert', 'data location');   (2)
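The core of a VAX-D-to-PC conversion like (2) is reinterpreting each 4-byte VAX F_floating value as an IEEE-754 single. The sketch below, in Python for illustration (the paper's script is MATLAB), uses the standard trick: VAX F stores two little-endian 16-bit words with the sign/exponent word first, so swapping the words yields an IEEE single whose value is exactly four times the VAX value (exponent bias 129 versus 127). This is an outline of the technique, not the C3D_VaxD2PC code.

```python
import struct

def vax_f_to_float(raw):
    """Convert 4 bytes of VAX F_floating data to a Python float.

    Swap the two 16-bit words, interpret as a little-endian IEEE-754
    single, and divide by 4 to correct for the bias difference.
    """
    swapped = raw[2:4] + raw[0:2]          # swap the 16-bit words
    (as_ieee,) = struct.unpack("<f", swapped)
    # A VAX exponent field of 0 means the value is exactly zero.
    exponent = ((raw[1] & 0x7F) << 1) | (raw[0] >> 7)
    return 0.0 if exponent == 0 else as_ieee / 4.0
```

Applying this to every coordinate and analog value in a DEC-written C3D file produces data that mcread() can consume on a PC.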

Fig. 3. Procedure for processing public-domain mocap data with the enhanced MoCap Toolbox

As shown in Figure 1(a), the error is resolved by embedding this script. The graphical representation of the database can be seen in Figure 3.

Fig. 4. An example marker (sensor) configuration scheme on the human body, according to the CMU data

Fig. 8. An example of a specific animated pose produced by function (4)

Fig. 10. (a) & (b) show the sensor connections through the mapar.conn structure field, used to construct the mesh human skeleton, and the human stick skeleton extracted with the mcm2j function, which transforms the sensor information into joint positions; (c) and (d) demonstrate the edited functions explained in Sections III and IV

newpar1 = mcanimate(walk1, mapar);   (5)
newpar2 = mcanimate(walk2j, japar);   (6)

These two functions (5 & 6) executed successfully on the mentioned data and produced the following animations.

Fig. 11. Results after the function modifications: (a) marker positions of the animation poses of human walking, and (b) the normalized human skeleton walking animation poses from the CMU mocap data

Fig. 13. An example of missing frames in the hip joint movement