1. Data-analyzing program
Last time we made progress on our data-analyzing program: we added waveform diagrams showing how the input data from the eight sensor ports changes over time, to support further analysis.
These days we have been working on the methodology for triggering music functions. According to our earlier test results, area is one of the major factors affecting the received data value. Even when the same gesture is performed, the received values can vary a lot between people, since everyone has a different hand shape and size.
As a result, we switched from our initial idea of analyzing the raw data values directly to a whole new approach: differentiating the input waveform over each small time window, so that we obtain the relationship of slope against time. This is useful because we are then observing and analyzing only the users' actions - whether they are approaching or leaving the sensors, and whether they are generating a fast or slow beat. All of these produce changes in slope, so analyzing the slope trends stays accurate across different users and hand shapes.
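To make this concrete, here is a minimal sketch of per-window slope extraction. The class and method names (SlopeAnalysis, WindowSlopes), the window size and the sample period are illustrative assumptions, not our actual sensing code:

```csharp
using System;

// Hypothetical sketch of the per-window slope analysis described above;
// the names below are illustrative, not the project's actual identifiers.
class SlopeAnalysis
{
    // Least-squares slope of the sensor signal in each fixed-size time window,
    // so a gesture is described by how fast the value rises or falls
    // rather than by its absolute level.
    static double[] WindowSlopes(double[] samples, int windowSize, double samplePeriod)
    {
        int windows = samples.Length / windowSize;
        var slopes = new double[windows];
        for (int w = 0; w < windows; w++)
        {
            double tMean = (windowSize - 1) / 2.0;
            double xMean = 0;
            for (int i = 0; i < windowSize; i++) xMean += samples[w * windowSize + i];
            xMean /= windowSize;

            double num = 0, den = 0;
            for (int i = 0; i < windowSize; i++)
            {
                double dt = i - tMean;
                num += dt * (samples[w * windowSize + i] - xMean);
                den += dt * dt;
            }
            // Convert the slope from "per sample" to "per second".
            slopes[w] = (num / den) / samplePeriod;
        }
        return slopes;
    }

    static void Main()
    {
        // Fake sensor trace: a hand approaching (value rising), then leaving (value falling).
        var data = new double[40];
        for (int i = 0; i < 20; i++) data[i] = i * 5;
        for (int i = 20; i < 40; i++) data[i] = 100 - (i - 20) * 5;

        foreach (double s in WindowSlopes(data, 10, 0.01))
            Console.WriteLine($"slope = {s:F1} units/s");
    }
}
```

The point is that the analyzer reacts to how quickly the value rises or falls within each window, not to its absolute level, so the same thresholds can work for users with very different hand sizes.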
Our re-developed UI
2. DJ table UI with several completed music effects
Meanwhile, we are also developing the user interface for the DJ table. At this stage we already have a simple interface that allows users to load (.wav) songs from their local directories and to adjust the volume during playback. We also provide six effect parameters to adjust (Dry, Wet, Feedback, Sweep Rate, Sweep Range and Frequency), and we have included some rhythmic music pieces that can be remixed with the playing track for a greater entertainment effect.
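The parameter set (Dry, Wet, Feedback, Sweep Rate, Sweep Range) is typical of a delay-modulation (flanger-style) effect like those covered in the reference below. The sketch here shows one plausible way those controls could drive per-sample processing; the class name, default values and mixing formula are assumptions for illustration, not our actual DJ table code:

```csharp
using System;

// Flanger-style sketch based on the UI parameter names above.
// This is an illustrative guess at how those controls could map onto a
// delay-modulation effect, not the project's actual implementation.
class Flanger
{
    public float Dry = 0.7f;          // level of the unprocessed signal
    public float Wet = 0.7f;          // level of the delayed signal
    public float Feedback = 0.5f;     // how much delayed output is fed back
    public float SweepRate = 0.25f;   // LFO speed in Hz
    public float SweepRange = 0.002f; // maximum extra delay in seconds (~2 ms)

    private readonly float sampleRate;
    private readonly float[] delayLine;
    private int writeIndex;
    private double phase;

    public Flanger(float sampleRate)
    {
        this.sampleRate = sampleRate;
        delayLine = new float[(int)(sampleRate * 0.01) + 2]; // room for up to 10 ms of delay
    }

    public float Process(float input)
    {
        // LFO sweeps the delay time between 0 and SweepRange seconds.
        float delaySeconds = SweepRange * 0.5f * (1f + (float)Math.Sin(2 * Math.PI * phase));
        phase += SweepRate / sampleRate;
        if (phase >= 1) phase -= 1;

        // Read the delayed sample with linear interpolation.
        float delaySamples = delaySeconds * sampleRate;
        float readPos = writeIndex - delaySamples;
        if (readPos < 0) readPos += delayLine.Length;
        int i0 = (int)readPos;
        int i1 = (i0 + 1) % delayLine.Length;
        float frac = readPos - i0;
        float delayed = delayLine[i0] * (1 - frac) + delayLine[i1] * frac;

        // Write input plus feedback into the delay line, then mix dry and wet.
        delayLine[writeIndex] = input + Feedback * delayed;
        writeIndex = (writeIndex + 1) % delayLine.Length;
        return Dry * input + Wet * delayed;
    }

    static void Main()
    {
        var flanger = new Flanger(44100f);
        // Run a short 440 Hz tone through the effect and print a few output samples.
        for (int n = 0; n < 10; n++)
        {
            float x = (float)Math.Sin(2 * Math.PI * 440 * n / 44100.0);
            Console.WriteLine(flanger.Process(x).ToString("F4"));
        }
    }
}
```

In a real effect chain, Process would be called once per sample of the loaded .wav data, and the remaining Frequency control could, for example, set the cutoff of an additional filter stage.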
Our DJ table UI and the programming code
We will continue to work on both areas, developing the sensing methodology further and improving the functions of the DJ table. Fighting!
Reference: Programming Audio Effects in C#