This project was born after a complicated orchestra rehearsal caused by the absence of several musicians. The idea came to me that virtual instruments could compensate for the missing players, particularly those the other musicians rely on. But how to synchronise virtual instruments during a rehearsal?
The simplest way is to start from the conductor, who guides the musicians. By developing a connected baton able to detect the conductor's movements, it would be possible to transcribe and interpret this information.
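To give an idea of how the baton could interpret movement, here is a minimal sketch of beat detection on an Arduino-style microcontroller. It is an illustration only, not the actual firmware: readAccelMagnitude() is a placeholder for a real accelerometer driver, and the threshold and timing values are assumptions.

```cpp
// Illustrative beat detection: a beat is a sharp acceleration peak,
// separated from the previous one by a refractory period so that a
// single gesture is not counted twice.
const float BEAT_THRESHOLD = 1.8;        // hypothetical threshold, in g
const unsigned long REFRACTORY_MS = 200; // ignore rebounds after a beat

unsigned long lastBeat = 0;

float readAccelMagnitude() {
  // Placeholder: replace with a real accelerometer driver
  // returning the acceleration norm in g.
  return 1.0;
}

void setup() {
  Serial.begin(115200);
}

void loop() {
  float a = readAccelMagnitude();
  unsigned long now = millis();
  if (a > BEAT_THRESHOLD && now - lastBeat > REFRACTORY_MS) {
    lastBeat = now;
    Serial.println("BEAT");  // relay the beat event to the receiver here
  }
}
```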
Regarding the sound system, it is better to play the sound from within the group of musicians, so that they hear the sound environment they are accustomed to. The idea is to place bots at the location of each absent musician, with each bot playing a specific voice.
The current solution is based on a network of interconnected devices:
The project works as it is; however, I would like to continue improving it to make it even easier to use:
The signals interpreted by the baton can be relayed to visually impaired musicians via a haptic device. Various theses on the subject show that such a project requires careful engineering to produce a simple and effective device for the user.
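As a sketch of that idea (not a tested design), the following Arduino-style code pulses a vibration motor each time a beat message arrives; the pin number, pulse length, and one-byte message format are all assumptions.

```cpp
// Pulse a vibration motor on each incoming beat message.
const int MOTOR_PIN = 5;           // hypothetical PWM pin driving the motor
const unsigned long PULSE_MS = 80; // short pulse so beats stay distinct

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  if (Serial.available() && Serial.read() == 'B') { // 'B' = one beat
    analogWrite(MOTOR_PIN, 200);   // motor on (strength 0-255)
    delay(PULSE_MS);
    analogWrite(MOTOR_PIN, 0);     // motor off
  }
}
```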
A conductor asked me whether such a system could be used to train conductors: a trainee could practise in complete autonomy while enjoying a realistic sound experience. It could also be used to evaluate a conductor's performance and highlight areas for improvement.
As initiator and leader of the project, I carried out a feasibility study based on video analysis. This provided me with a set of data giving the characteristics of the signals to be processed (acceleration, etc.). From this, I deduced the computing power required and the components needed (microcontroller, accelerometer, gyroscope).
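The core of that analysis is simple numerics. The sketch below shows the kind of computation involved, assuming baton positions have already been extracted frame by frame from the video; the position values and frame rate are synthetic, made up for illustration.

```cpp
// Estimate peak baton acceleration from tracked positions using
// central finite differences: a ~ (y[i+1] - 2*y[i] + y[i-1]) / dt^2.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
  const double fps = 60.0;  // assumed camera frame rate
  const double dt = 1.0 / fps;
  // Tracked vertical positions of the baton tip, in metres (synthetic).
  std::vector<double> y = {0.000, 0.005, 0.018, 0.038, 0.060, 0.075,
                           0.080, 0.075, 0.060, 0.038, 0.018, 0.005,
                           0.000};
  double peak = 0.0;
  for (size_t i = 1; i + 1 < y.size(); ++i) {
    double a = (y[i + 1] - 2.0 * y[i] + y[i - 1]) / (dt * dt);
    peak = std::max(peak, std::fabs(a));
  }
  std::printf("peak acceleration ~ %.1f m/s^2 (%.1f g)\n",
              peak, peak / 9.81);
  return 0;
}
```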
Still working alone on the project, I created an initial prototype combining electronics and software, proposing a solution running on a simple computer together with the baton and its receiver.
In a second version, I opted for a solution based on the Adibox, allowing better performance and more appropriate sound distribution through the use of bots. This work was carried out as a team.
You can find more information about the project on the Adimuse-Labs website (the website for Adimuse innovations).
I took additional robotics courses during my studies at ISEN, which were very project-oriented: we worked in teams of four students on a year-long project. The aim was to understand the material we were given, adapt it if needed, and make the robot meet the specifications.
Our challenge was to build a car with mecanum wheels controlled by an AI. Our robots had to move through a maze without bumping into the walls or into other robots.
Mecanum wheels are wheels fitted with small rollers around their circumference, enabling new kinds of movement: lateral movement, rotation on the spot, diagonal movement, and turning while keeping the wheels straight.
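The standard inverse kinematics for such a platform maps a desired body velocity to four wheel speeds. The sketch below uses the textbook formula rather than our actual course code; the signs depend on how the rollers are oriented, so a real robot may need them flipped.

```cpp
// Textbook mecanum inverse kinematics: given a desired body velocity
// (vx forward, vy left, wz rotation), compute the four wheel speeds.
#include <cstdio>

struct WheelSpeeds { double fl, fr, rl, rr; };  // rad/s

WheelSpeeds mecanumIK(double vx, double vy, double wz,
                      double lx, double ly, double r) {
  double k = lx + ly;  // half wheelbase + half track width
  return {
    (vx - vy - k * wz) / r,  // front left
    (vx + vy + k * wz) / r,  // front right
    (vx + vy - k * wz) / r,  // rear left
    (vx - vy + k * wz) / r,  // rear right
  };
}

int main() {
  // Pure lateral translation: vx = 0, vy = 0.2 m/s, no rotation.
  WheelSpeeds w = mecanumIK(0.0, 0.2, 0.0, 0.10, 0.12, 0.03);
  std::printf("fl=%.2f fr=%.2f rl=%.2f rr=%.2f rad/s\n",
              w.fl, w.fr, w.rl, w.rr);
}
```

For a pure sideways move, the wheels spin in opposite diagonal pairs, which is exactly what lets the car strafe without turning its wheels.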
In the course of my various projects, I have had to model many parts in 3D, from simple parts like the mouthpieces of my PVC flutes to complete assemblies like my modular multi-neck guitar.
I mainly use two tools: SolidWorks and FreeCAD.
One example of my 3D creations is a sixth finger, built from a silicone finger I made during my internship at INRIA. I designed a frame to attach it to a human hand.
During my last year of middle school, I had the opportunity to do two internships, notably one in the laboratories of the Defrost project team at INRIA (the French Institute for Research in Computer Science and Automation).
This team works on a simulation tool for soft robotics.
Soft robotics is an innovative approach to robotics based on the deformation of materials. This approach has several advantages:
Soft robotics is based on flexible structures that are complex to model. It requires more computation and a good knowledge of mathematics.
In rigid robotics, rigid solids move relative to one another: there are few possible movements, and they are easy to model.
In soft robotics, the solids themselves deform, offering a much wider range of movement and an ability to adapt to the external environment. This explains the difficulty of modelling them.
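A toy example makes the cost difference concrete: a rigid link is described by a single joint angle, while even a crude deformable rod needs many coupled nodes and a small time step. The mass-spring chain below is only a toy model, far simpler than the finite-element methods that real soft-robotics simulation tools rely on.

```cpp
// A hanging deformable rod as a chain of masses and springs,
// integrated with semi-implicit Euler. Note how many state variables
// and how small a time step even this crude model needs.
#include <cstdio>
#include <vector>

int main() {
  const int N = 20;          // nodes along the rod; a rigid link needs 1
  const double m = 0.01;     // node mass (kg)
  const double k = 50.0;     // spring stiffness (N/m)
  const double rest = 0.01;  // rest length between nodes (m)
  const double damp = 0.02;  // small damping (N*s/m)
  const double dt = 1e-4;    // stiff springs force a tiny time step
  const double g = -9.81;

  std::vector<double> y(N), v(N, 0.0);
  for (int i = 0; i < N; ++i) y[i] = -i * rest;  // hang straight down

  for (int step = 0; step < 20000; ++step) {     // 2 s of simulated time
    for (int i = 1; i < N; ++i) {                // node 0 is clamped
      double f = m * g - damp * v[i];
      f += k * ((y[i - 1] - y[i]) - rest);                 // spring above
      if (i + 1 < N) f += k * ((y[i + 1] - y[i]) + rest);  // spring below
      v[i] += dt * f / m;                        // semi-implicit Euler
    }
    for (int i = 1; i < N; ++i) y[i] += dt * v[i];
  }
  std::printf("rod tip after 2 s: %.3f m (rest length %.3f m)\n",
              y[N - 1], -(N - 1) * rest);
}
```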
When I was at middle school, I got started in robotics thanks to the Lego Mindstorms robot; it allowed me to build various small robots and better understand how they work.
At home, I practised with the challenges offered by the Lego Mindstorms kit.
I then started to create an articulated arm whose motors are offset from the arm itself to make it lighter. Here is a simulation: the robot follows a movement traced with a finger on a tablet.
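The underlying computation is classic two-link inverse kinematics: for each point of the finger trace, solve for the shoulder and elbow angles. The sketch below uses the textbook elbow-down solution with made-up link lengths and trace points, not my actual simulation code.

```cpp
// Two-link planar inverse kinematics (elbow-down solution).
#include <cmath>
#include <cstdio>

// Returns false when the target is out of reach.
bool twoLinkIK(double x, double y, double l1, double l2,
               double& shoulder, double& elbow) {
  double d2 = x * x + y * y;
  double c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2);  // law of cosines
  if (c2 < -1.0 || c2 > 1.0) return false;
  elbow = std::acos(c2);
  shoulder = std::atan2(y, x)
           - std::atan2(l2 * std::sin(elbow), l1 + l2 * std::cos(elbow));
  return true;
}

int main() {
  const double PI = std::acos(-1.0);
  // A few points of a finger trace (hypothetical, in metres).
  double trace[][2] = {{0.30, 0.10}, {0.28, 0.15}, {0.25, 0.20}};
  double s, e;
  for (auto& p : trace)
    if (twoLinkIK(p[0], p[1], 0.20, 0.15, s, e))
      std::printf("target (%.2f, %.2f) -> shoulder %.1f deg, elbow %.1f deg\n",
                  p[0], p[1], s * 180 / PI, e * 180 / PI);
}
```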
I use Arduino boards in various projects, such as my electronic bagpipe or the project from my additional robotics courses.
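As an illustration of the kind of Arduino code involved (not the actual bagpipe firmware), the sketch below reads finger holes as buttons and keeps a note sounding continuously, like a drone; the pins, fingerings, and frequencies are all assumptions.

```cpp
// Crude electronic bagpipe: each combination of closed holes maps
// to one note, and the note sounds continuously like a drone.
const int SPEAKER_PIN = 8;                // hypothetical speaker pin
const int HOLE_PINS[] = {2, 3, 4};        // hypothetical finger holes
const int FREQS[] = {523, 494, 440, 392}; // C5 down to G4 as holes close

void setup() {
  for (int pin : HOLE_PINS) pinMode(pin, INPUT_PULLUP);
}

void loop() {
  // Count closed holes; more closed holes give a lower note.
  int closed = 0;
  for (int pin : HOLE_PINS)
    if (digitalRead(pin) == LOW) ++closed;
  tone(SPEAKER_PIN, FREQS[closed]);
}
```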