Saturday, December 7, 2013

Tutorial with Yee Xiang

I have been so stressed this new semester because I couldn't solve the problem in my project. On top of that, some personal issues left me without money to pay the college fees. I was scared to come to college every day with my terminated access card. The finance department kept ringing my phone for payment, and I had no one to turn to for paying my overdue tuition fees.

After speaking with Sweii, I felt a bit relieved. It was a good consultation session. Finally, I got my business partner to help pay off my credit card while I used it to pay the school tuition fees. I know I will owe my partner a debt, but that's okay; I can repay them after I graduate.

I had also been irresponsible in being absent from the previous tutorial session with my guest lecturer. Therefore, I sent an apology letter to him, hoping he would forgive me and help me solve my problem.


On the 6th of December I finally met my guest lecturer for the first time. He is a very nice guy who is willing to help solve my problem. I regret not showing up for the first class; if I had come to that first session, perhaps the "stress" from the coding problem I was facing would have been solved easily.


In the tutorial session, he considered my problem and suggested simplifying the code, since I only need the colour zones instead of massive layers of code. Besides that, he also suggested I avoid heavy graphics libraries to prevent the lagging issue.

Therefore, I took his advice and moved on to a simpler coding method: layers of circles.
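To make the "layers of circles" idea concrete, here is a hedged Python stand-in for my Processing sketch (the palette and height range below are invented for illustration, not from my actual code): each sensed height maps straight to one flat colour zone, so there is no fluid simulation to lag.

```python
# Map a sensed height to one of a few flat colour zones, drawn as
# layered circles. PALETTE and max_height are illustrative placeholders.
PALETTE = [(0, 0, 255), (0, 255, 255), (0, 255, 0),
           (255, 255, 0), (255, 128, 0), (255, 0, 0)]

def height_to_zone(height, max_height=200.0, zones=len(PALETTE)):
    """Return the palette index (circle layer) for a height reading."""
    h = min(max(height, 0.0), max_height)   # clamp noisy sensor values
    idx = int(h / max_height * zones)
    return min(idx, zones - 1)

def zone_color(height):
    """Colour for the circle layer at this height."""
    return PALETTE[height_to_zone(height)]
```

The whole rendering step reduces to one table lookup per reading, which is exactly why this approach avoids the lag.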

Friday, November 15, 2013

Failure

All this while I had been trying the SuperCollider code, and I found a major problem: I can't find any Kinect camera application/library that links directly to SuperCollider. I started to feel uncomfortable developing SuperCollider code, and I spent weeks searching for relevant tutorials and trying to solve it.

In the end, I decided to give up on SuperCollider as the coding platform and move back to Processing for the fluid library; hopefully it will work well.

Before linking in a Kinect library, I had been trying hard with the fluid library and found that the graphics processing is very laggy. It lags badly if I set it up with 36 colours; even with 6 colours, without the sparks and flares, it still has a delay of around 6 seconds.

Below is the specification at which it starts to lag:
Screen size: 500 x 500 pixels
Graphic (grid cell) size: 5 pixels
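A rough back-of-envelope calculation (my own estimate, assuming the "graphic pixel" setting is the simulation's cell size and a notional 30 fps target) shows why these settings crawl:

```python
# A 500x500 canvas with a 5-pixel cell means the fluid sim touches
# 100x100 = 10,000 cells per frame, and every extra colour layer
# multiplies that work again.
def cells_per_frame(screen_px, cell_px):
    side = screen_px // cell_px
    return side * side

def cell_updates_per_second(screen_px, cell_px, layers, fps=30):
    return cells_per_frame(screen_px, cell_px) * layers * fps

print(cells_per_frame(500, 5))               # 10000
print(cell_updates_per_second(500, 5, 36))   # 10800000 -- why 36 colours crawl
print(cell_updates_per_second(500, 5, 6))    # 1800000 -- still heavy
```

At full screen the cell count grows with the square of the resolution, which matches what I saw in practice.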

Hmm, then I started thinking... there is no way I could take this full-screen if it is this laggy. I almost decided to give up on this library again.

Friday, September 6, 2013

My Pitch session

Today was my pitch presentation session, and I came out with the kind of presentation I am used to.
It is a video presentation: while the video plays, I need to adjust my timing properly and explain the content within the allotted time.

Below is the presentation video i had made:

In the presentation video, I recorded a little demo of how the height of the salt can trigger the Kinect camera to output sound.
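The trigger logic in the demo boils down to a height threshold. Here is a hedged sketch of the idea (the depth numbers and threshold are invented, and a real Kinect delivers a full depth image rather than single readings):

```python
def salt_trigger(depth_mm, base_mm=1000, threshold_mm=30):
    """Fire a sound when the salt surface rises threshold_mm above the table.

    The Kinect reports distance, so a higher pile reads as a *smaller*
    depth value; base_mm is the empty-table reading (an assumed value).
    """
    height = base_mm - depth_mm
    return height >= threshold_mm

# Salt being piled up: the third reading crosses the threshold.
readings = [1000, 995, 960, 940]
fired = [salt_trigger(d) for d in readings]
```

In the real setup this check would run per zone of the depth image, with each zone mapped to its own note.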

Friday, August 30, 2013

Program coding

This is the visual I achieved with the SuperCollider code. I was able to set up multitouch, with the swarm moving towards it. *The red colour is the multitouch zone.

The problem I faced is that the multitouch option is not loopable; it freezes at a location once it detects an input.

Sunday, August 25, 2013

Programs I will use

Kinect Core Vision by the NUI Group
It is used to detect the input data and export it in TUIO format to other programs.

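TUIO rides on OSC: a real tracker sends binary bundles of messages such as `/tuio/2Dcur set id x y ...`. As a hedged illustration (the tuple below is a simplified stand-in, not the wire format), the data handed to the next program looks like this:

```python
def parse_cursor(message):
    """Parse a simplified ('/tuio/2Dcur', 'set', id, x, y) stand-in.

    Real TUIO arrives as binary OSC bundles with more fields (velocity,
    acceleration); this only shows the shape of the payload.
    """
    address, command, cursor_id, x, y = message
    if address != "/tuio/2Dcur" or command != "set":
        raise ValueError("not a cursor set message")
    return {"id": cursor_id, "x": x, "y": y}

# One tracked finger/blob at normalised position (0.25, 0.75).
cursor = parse_cursor(("/tuio/2Dcur", "set", 3, 0.25, 0.75))
```

Because the coordinates are normalised 0..1, any program on the receiving end can scale them to its own canvas.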
SuperCollider - a real-time audio synthesis and algorithmic composition environment.

Tutorials can be found at:
http://supercollider.sourceforge.net/learning/
http://www.bartetzki.de/en/lisbon_sc07.html

I made some attempts at the sound and Processing code.
This is the audio library test; it works quite well for audio output.
But...

This is a screenshot of the Processing app I coded: note the pixelated and laggy visual.


The reason I used SuperCollider is that Processing could not support high-resolution graphics library rendering without the pixelation issue, while SuperCollider is designed especially for audio and visuals.

Saturday, August 24, 2013

Following are some of the tutorials I found for Processing coding:



Audio Processing http://www.ee.columbia.edu/~dpwe/resources/Processing/
- I learned to simulate sound using the open sound library, making a virtual guitar tone.
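The classic code-only way to fake a plucked guitar tone is Karplus-Strong synthesis. This is my own minimal sketch of that technique, not code taken from the linked tutorial; the frequency and decay values are illustrative:

```python
import random

def pluck(freq=110.0, sample_rate=44100, seconds=0.5, decay=0.996):
    """Karplus-Strong: a noise burst fed through a looped averaging filter."""
    period = int(sample_rate / freq)
    # The "pluck": fill the delay line with noise.
    buf = [random.uniform(-1.0, 1.0) for _ in range(period)]
    out = []
    for i in range(int(sample_rate * seconds)):
        s = buf[i % period]
        nxt = buf[(i + 1) % period]
        # Average neighbouring samples so the string "loses energy".
        buf[i % period] = decay * 0.5 * (s + nxt)
        out.append(s)
    return out

samples = pluck()  # half a second of a decaying, A2-ish string tone
```

Writing `samples` out as a WAV file gives a surprisingly convincing plucked-string sound for a dozen lines of code.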

SoundTouch Audio Processing Library http://www.surina.net/soundtouch/
- an open-source audio library for generating and manipulating sound in Processing code.
http://www.surina.net/soundtouch/soundstretch.html
- demo pieces for final audio

Review & project documentation from others
http://createdigitalmusic.com/2008/11/spaces-and-roots-manipulating-sound-with-processing-touch-tangible-interfaces/
- to explore different software
tbeta (“The Beta”): finger tracking. tbeta is an open-source, cross-platform computer vision and multi-touch sensing platform. It’s the successor to the former touchlib, which wasn’t as cross-platform or quite as awesome. More on tbeta on Create Digital Motion.
reacTIVision: fiducial marker tracking for objects. (Fiducial markers are these funny, cellular-looking patterns pictured at right that allow you to track specific objects manipulated on the table. reacTIVision is the open-source library developed by the folks who did reactable. Sounds as though we might get fiducial tracking in the other library, though.)
ChucK: a strongly-timed, quick-to-code sound and synthesis language. It’s elegant enough that it’s used for real-time programming – as in, onstage, in laptop ensembles like PLOrk and (its West Coast descendent we just saw here on CDM) SLOrk.
Native Instruments Reaktor: The modular sequencer, instrument, and effect builder, which we cover regularly on our Kore minisite. It's the only commercial / non-open-source choice here, though it may actually replace ChucK on Roots in the future.

Basic Sound Tutorial for processing

http://www.openprocessing.org/sketch/41522

A MIDI-based audio demo application created with Processing
http://webaudiodemos.appspot.com/

Demo using swarm space coding
http://www.creativeapplications.net/maxmsp/interactive-swarm-space-c-maxmsp/

GRID is used for my graphic reference 
http://www.futura-epsis1.com/project/GRID


Arduino + Processing + touch sensor: http://forum.arduino.cc/index.php?topic=53477.0
- used to study how to sync the three codebases together.

Friday, August 23, 2013

Kinect sensor

Research into similar sensors as a cheaper alternative to the Kinect.
A sensor from Leap Motion: only $79.99, but not suitable for my project usage.

https://www.leapmotion.com/apps

Some useful Kinect game information is available here.
http://www.kinecthacks.com/


I spent quite some time searching for a proper Kinect camera. Kinect cameras with a USB adapter are quite limited; I walked through most of the shops in KL and Selangor, but most of the Kinect cameras I found came without the USB adapter, which means I couldn't connect them to my Mac to use them.
After quite some time, I decided to take Kelvin's advice and purchase the adapter online from Singapore.

Finally, it all stacks together into one piece!

Friday, June 21, 2013

SOI submission


In case it is hard to view, please download it from this link:

https://docs.google.com/file/d/0B1QKmKBS4-NSNEdOOTAwYlQ3dlk/edit?usp=sharing



Concept design for installation

The process of making a mockup design for the Statement of Intent.
Done with Photoshop 3D, but due to my limited technique I did it in black and white only.

Manually adding the LED colour.

Montage with some coloured salt.

Logo creation: inspired by salt bottle caps and the crystallised shape of salt particles.



And the outcome

Friday, June 14, 2013

LED research

Basic knowledge of LED strips


This video taught me how to do custom wiring modifications on LEDs.

Soldering

This video taught me how to solder an LED strip.


Coding source code:
https://github.com/enyone/cvavr/tree/...


This is one of the programs I found for modifying the LED colour, but it is Windows-only and not applicable on Mac, so it will be my second option. The disadvantage: if I use this program, I need to chain three programs together to make my project work, which is not ideal, as it will affect processing speed and make my real-time visuals lag.

LED programming


Open-source Philips LED controller
http://loosen.home.xs4all.nl/elektronica/BobLight/
Boblight is an open-source program to imitate the Philips Ambilight. It expects a controller with three individual LED outputs (left, top, right).
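The Ambilight idea behind Boblight can be sketched in a few lines. This is a hedged illustration only: the nested-list "frame" stands in for a real screen grab, and the real program does this inside its daemon.

```python
def edge_average(pixels):
    """Average a strip of (r, g, b) pixels into one LED colour."""
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) // n for c in range(3))

def ambilight(frame):
    """frame: rows of (r, g, b) tuples -> colours for the three outputs."""
    return {
        "left":  edge_average([row[0] for row in frame]),   # left column
        "top":   edge_average(frame[0]),                    # top row
        "right": edge_average([row[-1] for row in frame]),  # right column
    }

# A tiny 2x2 "screen": red on the left column, blue on the right.
frame = [[(255, 0, 0), (0, 0, 255)],
         [(255, 0, 0), (0, 0, 255)]]
colours = ambilight(frame)
```

Each edge average is then pushed out to the matching LED channel, so the room lighting follows whatever is on screen.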


For Arduino, some similar projects are listed here:
http://www.instructables.com/id/Ultra-bright-LED-Color-Changing-Spotlight-using-Op/
The device:
Cheap RGB controller to cut my cost XP
Source: http://hackaday.com/2012/05/10/reverse-engineering-an-rgb-led-remote/

Source two: http://blog.allgaiershops.com/2012/05/10/reversing-an-rgb-led-remote/

This article is about hacking an IR controller instead of a Wi-Fi controller, which seems the cheapest among the hardware I found. So this is my first option; if I can customise it successfully, it will be an advantage for continuing with the next step: the Kinect camera setup.

Tutorial sheet for 14 June 2013


54321 poster presentation


Friday, May 31, 2013

Second class, deeper development

From the last tutorial session, I found that I could take Kelvin's advice and merge part of my first idea with the third one, as both of them have some interesting elements.

I will carry on with the music installation, following an example of a music installation called Communion:

Communion from FIELD on Vimeo.

We can see this installation animating the communion and essentially evolving them into small creatures. This video is not really about interactivity, but I intend to show it because of the way the graphical elements sync with the music to create a larger impact. I truly believe this kind of installation can bring a more immersive and interactive experience to the audience; that is why I am still going with the musical and light installation idea.

The Ste-Justine musical wall from Moment Factory also shows how a musical installation can catch an audience's interest and make them interact with it.

Ste Justine musical wall from Moment Factory on Vimeo.


In this project, I want to make a more artistic musical installation, one that draws out people's hidden musical talent and can be placed in any exhibition or event hall.

In the journal article Kindling the Spark: Recognizing and Developing Musical Talent, Joanne Haroutounian, PhD, talks about how talented musicians are developed by getting people to imagine further depths of the music without a set rhythm, and to build on that. LINK

Joanne also quotes the following from Seashore:
"According to Seashore: 'After a comparatively early age, these capacities do not vary with intelligence, with training, or with increasing age... It makes the diagnosis of talent possible before training is begun and points to certain very definite principles of music education... It is the meaning, and not the capacity, of these forms of impression which we train and which matures with age in proportion to the degree of intelligence and emotional drive.' Carl Seashore, Psychology of Music, 1938"

In the article Musical Talent: Innate or Learned? by Julie A. Wojcik, M.Ed., NCSP,
she says: "Researchers recognize such indicators of precocious musical talent as an innate ability to identify pitch." She claims that musical talent is innate and only requires cultivation.

Music, Mind, and Meaning by Marvin Minsky, in chapter one, "Why Do We Like Music?", mentions that nowadays people can play music because of environmental influence, somewhat as if being able to sing means we can play music, as though we have musical cells in our body. Besides that, the article also mentions that we get a cultural immersion every few hours whenever there is a radio or music around us, touching and influencing our emotions to absorb the musical "particles".

So, from the points above, I believe my musical installation idea might really work to draw out interest from the audience/players. Maybe not everyone can play nice music, but at least in this culture almost everyone has some interest in musical elements, and there is a high chance they would play with it.

The installation I plan to build includes the colour-changing LED bulbs I showed in the last post. With the help of the ambient lighting effect, it can affect human mood and social behaviour.
Different light colours also have different effects; the press release from PNAS, "The color of ambient light influences our mood", briefly discusses how the brain works and how our mood is affected under different lighting, and how certain lighting environments make us imagine more.


Interactive Main Stage @ Trailerpark Festival 2011 is a light interaction installation controlled by hand movements tracked with a Kinect camera.
Interactive Main Stage @ Trailerpark Festival 2011 from Dark Matters on Vimeo.

This is what I would like my project's visuals to look like:
Scan Processor Studies (excerpts pt.1) from Brian O'Reilly on Vimeo.

Another interface I would like to use for my installation; I found out this kind of interface is called Sandscape:
http://tangible.media.mit.edu/project.php?recid=36



For the sound part, my installation could feel like this, but in wave form:

http://www.earslap.com/

Open source apps/coding.

SuperCollider is a free program from http://www.audiosynth.com/, used to compose synthesised audio alongside graphics. There is also a plugin called Hadron to add on, which visualises simple graphics and sets up the audio.

Tutorials can be found at http://supercollider.sourceforge.net/community/
and
http://www.newscores.com/scforum/
and 
http://electro-music.com/forum/forum-139.html

Gephi, an open-source graph visualization and manipulation software.
Available at: https://gephi.org/

Tutorials are available at
https://gephi.org/users/
and in the wiki:
http://wiki.gephi.org/index.php/Gephi_User_Manual

The link below also shows 5 free open-source programs for creating realtime video synthesis.
http://www.gutsblow.com/Archive/12/5-free-open-source-apps-to-create-mind-blowing-motion-graphics


1. Processing
Before you close this page, thinking that this post is nothing but a list of scripting/code related apps, I have to say that Processing is by far the most powerful yet easiest set of tools that allow to create wide variety of graphics from Interactive Kiosks to iTunes Visualizers.
Even though the learning curve is a little bit steep, especially for a person new to coding, it is more like a fusion of AE and Flash. It has the dynamic nature of Flash and the looks of After Effects. By default, it comes with numerous libraries including sound analysis. It is being rapidly adopted by various studios and institutions and has a very loyal user base. Check out their amazing Gallery. There are many books written on Processing to get you started right from the bare basics, but the Learning pages in the site are quite helpful too. Processing is a Cross Platform Open Source application distributed for Mac/Win/Linux.

2. Quartz Composer
Actually intended for developers, Quartz Composer (QC) is very famous for its simplicity and ease of use. It is a GPU-accelerated visual programming tool that helps you create very dynamic and cool Core Image filters, animations, transition effects, or iTunes music visualizers.

Honestly speaking, this Application left me spell bound when I first discovered it. This is the best example to show Apple’s quality and simplicity in their Apps. It is widely used by VJ’s all over the world along with third party Apps like VDMX. Quicktime & Safari have native capability to preview QC files, which means you need not have Quartz Composer installed to preview QC Compositions. Also, Quartz Composer allows you to create OpenGL shaders easily without having it to do in a low level programming language. But, here comes the most interesting part, Kineme has numerous patches and plugins which makes Quartz Composer a complete toolkit that analyzes audio, produces particle effects and Even Imports 3D Models and manipulates them. Kineme also has an offline renderer which produces highly Anti-Aliased renders, thus opening numerous possibilities for Motion Graphics. This application can do many things and it is truly impressive. You can create your own plugins for various Pro Apps like AE, Final Cut, Apple Motion using various 3rd Party Tools like Effect Builder AE , Fx Factory Pro , CHV-Plugins etc., There are many tutorials on Vimeo to get started and also the famous Rayz-O-Lite tutorial from DVCreators. Unfortunately, Quartz Composer is Mac Only.
Also, check out Zugakousaku’s Amazing Quartz Composer Gallery

3. VVVV
VVVV is a toolkit for realtime video synthesis on Windows, very similar to Quartz Composer on Mac, but way more powerful. It has a wide variety of features, from 3D mesh manipulation to audio synthesis and realtime video synthesis. Contrary to Processing/Flash, VVVV has a reputation for being used for super-high-quality video synthesis in real time. There are many projects where visuals are generated at resolutions above 4K at 60 fps. Strukt widely uses VVVV for many of their projects.

Also, you can see amazing visuals in the Vimeo channel. On the downside, it is Windows only and it has a steep learning curve with limited resources. But the quality of the visuals produced by this toolkit surpasses all the other generative applications in real time. VVVV is free for Non-Commercial Use.


4. eMotion
eMotion is primarily designed for text- and particle-based effects, for both realtime and offline synthesis. Even though this application is still in alpha, it produces amazing typography videos. eMotion's text and particle tools kick AE's and even Trapcode Particular's ass. Its force-based physics simulations are way ahead of any particle system I have seen so far. Moreover, all these effects happen in real time and can be customized with a wide range of settings and scripts. As it is in alpha there might be some problems with the interface and usability, but it is by far stable. If eMotion gains popularity, it will surely become a must-have tool for any text/particle-based effects. eMotion is free to download and Mac-only. There are some good tutorials on Vimeo to get you started.

5. Nodebox
Nodebox is another application which has a similar set of tools like Processing, but it is Python based. It is very easy to use and is actually intended for Graphic Design and Animation. It has many advanced features like built-in PDF, Quicktime,Fonts & CoreImage capabilities. There are many number of libraries available for download, and there are many resources to help you get started. But, it is widely believed that if you know Processing, it is easy to get around in NodeBox. Right Now, the stable version of Nodebox is available for Mac Only , while experimental versions are available for Windows and Linux.
These are some of the Applications that help you create really high quality visuals without using regular programs like After Effects or Motion. Also if you are interested in such applications, check out Subblue’s blog post on Fractal and Generative Art resources where he covers many other resources which aren’t related to motion design. Also I didn’t include Blender as a lot of people already use it and it is primarily intended for 3D design.
If you know any other Apps/Toolkits please list them in the comments, I will be happy to include.




When setting up my installation, I will include the LED light as ambient lighting to control the environmental experience; the light will change around the room as the graphics change colour. This gives the user the ability to achieve their own mood.

http://geekologie.com/2012/09/lifx-the-color-changable-wi-fi-enabled-2.php


The light mood can change as the music changes.
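One simple way to tie the light mood to the music is to map the audio loudness onto the LED hue. A hedged sketch (the 0..1 amplitude scale and the hue mapping are my own assumptions, not any LED controller's API):

```python
import colorsys

def amplitude_to_rgb(amplitude):
    """Map a 0..1 loudness reading to an RGB triple for the LED strip.

    Quiet sounds sit at red (hue 0); louder sounds walk around the
    colour wheel through green and cyan.
    """
    hue = max(0.0, min(amplitude, 1.0))          # clamp noisy readings
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return (int(r * 255), int(g * 255), int(b * 255))

quiet = amplitude_to_rgb(0.0)   # red
loud = amplitude_to_rgb(0.5)    # cyan
```

The resulting triples would then be sent to whichever controller (IR hack or Wi-Fi bulb) I end up using.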

And there are a few ways to hack the LED light with some technical customisation.
Tutorials are shown below.
http://lifehacker.com/5941916/hack-a-rgb-led-strip-and-control-its-color-over-wi+fi
http://www.shatteredhaven.com/2012/12/1379365-colorful-lights-hacking-philips.html
http://arstechnica.com/gadgets/2012/11/in-living-color-ars-reviews-the-hacker-approved-philips-hue-leds/
And there is an open-source control panel that can be customised to take control of the light colour and dimming options.


http://www.geek.com/news/hack-an-ikea-led-strip-and-control-it-with-your-smartphone-1514661/
has a tutorial on the Openpicus IDE software that helps to hack the LED: soldering the wires onto the LED strip, costing only around four hundred.
A deeper tutorial is available here:
http://community.openpicus.com/forum/software-ide-apps-and-libraries
and this page:
http://wiki.openpicus.com/index.php?title=IKEA_RGB_Led_Strip_Hack

Tutorial form for the idea consultation on 31 May 2013


Thursday, May 16, 2013

Idea and concepts

The first idea that came to my mind is to create something that makes the smartphone really "smart".
Since we keep bringing our phones around, I would like to design an app and device that connects the phone to our living environment, just by using Bluetooth technology.
For example:
The "smart" phone could detect that we are in the room through the Bluetooth signal range, so it could activate the light and set the alarm clock.
The next day, the app could switch on the computer when we get near our working desk, saving us time, and it could pull down preset data such as news and unread messages from our phone, making our life easier.
And we could activate the fan/light/air-con when we are in the room and turn them off when we leave.
So it would be a more eco-friendly product, in the sense of saving electricity when we are not using it.

My concept is somewhat similar to this.


But the actual real-world design/concept would look like this:


Bluetooth range technology is something like this:

iCookie


and this
StickNFind



Light switch



My inspiration is from Iron Man: a "smartphone" smart enough to know your location and position, and activate things for you.





Okay, second idea: inspired by something like the mental disorder schizophrenia.

How about the concept of a person assisted by an AI, like RoboCop, or Androids 16 and 17 in Dragon Ball, crossed over with the schizophrenia disorder?
Schizophrenia involves voices and characters that talk and manipulate inside the mind; they would be like an extra person who can assist and guide you, or create horror for you.

A chip implant that could be inserted into our brain, powered the same way our blood powers our cells. It would create illusions directly in our mind and generate an AI there to talk to us and help us gather information; maybe it could grab a Wi-Fi or TV signal so we could watch a show directly in our mind. Moreover, we could just Google something in our mind the moment we need it.

My design would look almost like this:
Sight from Sight Systems on Vimeo.
But not exactly the same, as I would prefer it to have some "Jarvis"-like elements from Iron Man: keeping up a chit-chatting conversation.

This is something we could call a "second brain", perhaps?