In this section you can find all the information related to my research experience.
Here I present a quantitative summary and a summary of my research topics. Please navigate through the sections below, or follow the links in the summaries, for further details.
Publications
Projects
Technology transfer
Reviews and committees
My students' theses
Demos and exhibitions
Open source releases
Other
Our physical model of the hand allows natural grasping (thanks to friction simulation) and complex manipulation of rigid and deformable objects (left image). The model captures subtle effects such as the non-linear deformation of the fingertips (center image). The internal articulated skeleton of the model is coupled both to the haptic device and to the external flesh envelope of the model (right image).
A few examples of deformable tools with haptic feedback using our Handle-Space Linearization (HSL) method. The method allows arbitrary placement of the tool handle (in red in the left image) and resolves intricate contact (center image), including self-collisions and large deformations (right image).
Left/Center: two example scenarios of internal anatomy palpation (from inside and from outside). Because of the strong coupling between the anatomical parts, the deformation of a single muscle or ligament may propagate to other coupled elastic parts, and this coupling must be solved while the haptic thread runs at 1 kHz. Right: the current state of the training simulator, since acquired by Simbionix and now in use in hospitals worldwide.
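The timing constraint can be illustrated with a deliberately simplified sketch (a 1D chain of coupled elastic nodes, not the simulator's actual solver): on every 1 ms tick, the contacted node is constrained to the probe, the coupled system is re-solved so the deformation propagates to the neighbouring nodes, and the reaction force is returned to the device.

```python
import numpy as np

def haptic_step(u, k_couple, k_anchor, contact_idx, probe_disp, iters=50):
    """One 1 ms haptic tick for a chain of coupled elastic nodes (sketch).

    u           -- nodal displacements from rest
    k_couple    -- stiffness coupling neighbouring nodes
    k_anchor    -- stiffness anchoring each node to its rest pose
    contact_idx -- node pushed by the haptic probe
    probe_disp  -- displacement imposed by the probe at that node
    Returns (updated displacements, reaction force felt by the probe).
    """
    u = np.asarray(u, dtype=float).copy()
    n = len(u)
    u[contact_idx] = probe_disp                 # probe imposes a displacement
    for _ in range(iters):                      # Gauss-Seidel-style relaxation
        for i in range(n):
            if i == contact_idx:
                continue                        # constrained node stays put
            nbrs = [u[j] for j in (i - 1, i + 1) if 0 <= j < n]
            # equilibrium of the anchor spring and coupling springs at node i
            u[i] = k_couple * sum(nbrs) / (k_anchor + k_couple * len(nbrs))
    nbrs = [u[j] for j in (contact_idx - 1, contact_idx + 1) if 0 <= j < n]
    force = k_couple * sum(ui - probe_disp for ui in nbrs)
    return u, force
```

The real simulator solves far richer models, but the per-tick structure is the same: constrain the contacted element, re-solve the coupled system within the 1 ms budget, and feed the reaction force back to the device.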
From left to right: the real scenario, where the user (wearing a glove) manipulates the haptic device; a visual recomposition of the scenario without the haptic device and the user's hand; a visual recomposition where the virtual objects (including the tool) have been added; and the full scenario seen by the user in real time, where the user's hand is rendered with a transformation that shows proper handling of the virtual tool.
Scanned turbine model with different occlusion-handling techniques applied, such as exploded views (left) and adaptive transparency (center). Information can be linked to the geometry through text labels whose layout adapts dynamically to the viewpoint (right).
Left: test application for our Ambisonics-based spatializer for loudspeaker systems. Right: example of integration of our HRTF-based binaural spatializer with Unity 3D, running on Windows, Mac OS X, Android and iOS.
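For context on what an Ambisonics spatializer computes, here is a minimal sketch of first-order B-format encoding in the FuMa convention (standard textbook formulas, not the project's code): a mono source at a given direction is split into an omnidirectional channel and three figure-of-eight channels.

```python
import math

def encode_first_order(sample, azimuth, elevation):
    """Encode a mono sample into first-order B-format (FuMa convention).

    azimuth   -- counter-clockwise from front, in radians
    elevation -- up from the horizontal plane, in radians
    Returns the four channels (W, X, Y, Z).
    """
    w = sample * (1.0 / math.sqrt(2.0))                   # omnidirectional
    x = sample * math.cos(azimuth) * math.cos(elevation)  # front-back
    y = sample * math.sin(azimuth) * math.cos(elevation)  # left-right
    z = sample * math.sin(elevation)                      # up-down
    return w, x, y, z
```

A loudspeaker decoder then mixes these four channels with per-speaker weights derived from the speaker directions; conventions (FuMa vs. ACN/SN3D) differ in channel ordering and normalization, so the encoder and decoder must agree.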
Test application for our binaural spatializer including hearing loss (top right) and hearing aid (bottom right) simulation. The hearing aid simulation model allows automatic configuration based on real audiometry data, using the standard FIG6 method.
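As a rough illustration of how an audiogram can drive an automatic fitting, here is the moderate-input (65 dB SPL) branch of the FIG6 prescription in the piecewise form it is commonly tabulated in. This is a sketch for illustration only, not the simulator's code; the thresholds and coefficients should be checked against the original FIG6 publication.

```python
def fig6_gain_65(hl):
    """Prescribed insertion gain (dB) for a 65 dB SPL input at one audiometric
    frequency, given the hearing threshold level hl (dB HL).

    Piecewise formulas as commonly tabulated for the FIG6 method.
    """
    if hl < 20:
        return 0.0                 # near-normal threshold: no gain
    if hl <= 60:
        return 0.6 * (hl - 20)     # mild-to-moderate loss
    return 0.8 * hl - 23.0         # severe loss

def fit_from_audiogram(audiogram):
    """Map an audiogram {frequency_hz: threshold dB HL} to per-band gains."""
    return {f: fig6_gain_65(hl) for f, hl in audiogram.items()}
```

FIG6 actually prescribes three gain curves (for 40, 65 and 95 dB SPL inputs), which together define a compressive input-output function per band; the 65 dB branch above is the one most often quoted.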
Left: architecture of the simulator, consisting of a Host machine responsible for the user interface (including graphics rendering) and a real-time Target machine running a co-simulation with different models (functional and multi-body) for each vehicle component. Right: screenshot of the simulator, showing some of the internal parts of the multi-body model (in different colors); the full multi-body model contains 78 parts simulated in real time.
Some example results of the benchmarks, which compare simulations run as fast as possible (left image, showing the graph for Xenomai with the CFS scheduler) against real-time simulations (right image, showing the graph for Xenomai with the FIFO scheduler), as a function of simulation complexity (number of nodes) and the number of parallel threads.
Left: detailed FE model of a honeycomb cell used in lightweight panels. Right: equivalence between a full honeycomb sandwich panel and its approximation with the equivalent-stiffness method used in our concept model.
Left: detailed FE simulation of a crash against a U-shaped structure. Center: the same structure simulated with the equivalent-mechanisms concept model. Right: detail of the implementation of one of the many components in our equivalent-mechanisms library.
Architecture of the multi-rate framework, which connects multi-body models running in VirtualLab Motion with functional models running in ImagineLab Amesim, at different rates and on different machines. The framework provides an API that allows interconnection of other software tools, and it implements different multi-rate interpolation schemes.
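The multi-rate idea can be sketched as follows, with a hypothetical interface (not the framework's actual API): the slow model exchanges its output only every few fast steps, and in between the fast model sees a linear extrapolation of that output instead of a stale zero-order hold.

```python
def cosimulate(fast_step, slow_step, t_end, h_fast, ratio):
    """Minimal multi-rate coupling sketch (hypothetical interface).

    fast_step -- fast model: (t, coupled_input) -> output, run every h_fast
    slow_step -- slow model: t -> output, run every `ratio` fast steps
    Returns the list of fast-model outputs.
    """
    t = 0.0
    y_prev = y_curr = slow_step(t)          # slow-model output samples
    step = 0
    log = []
    while t < t_end:
        if step % ratio == 0 and step > 0:  # exchange point: run slow model
            y_prev, y_curr = y_curr, slow_step(t)
        # linear extrapolation of the slow output inside the macro step
        frac = (step % ratio) / ratio
        y_interp = y_curr + frac * (y_curr - y_prev)
        log.append(fast_step(t, y_interp))
        t += h_fast
        step += 1
    return log
```

The extrapolation smooths the coupling signal between exchange points; higher-order multi-rate schemes refine the same idea by fitting polynomials through more past samples.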
Left: global view of the implemented control architecture. Right: comparison between human reactive behavior and the reactive layer of the implemented architecture.
The teacher application allows the teacher to set up the hardware structure (memories, registers, ...) and the instruction set (instruction prototype and microprogram for each instruction). It also lets the teacher generate random homework for each student with one click and automatically evaluate the students' work. The student application (shown in the image) provides the student with a graphical microprogramming environment and a self-assessment tool. The first version of the application focuses on teaching microprogramming skills.