Wednesday 15 December 2010

IDAT 204 - My killer application.

http://busy-signal.co.uk/mediaindex.html
Note: it does take a while to load and may require a refresh!


My killer App

The current web is designed for human readability and requires human interaction to perform any desired task. The semantic web would take the web to a new form of interaction by presenting data in a way that can be read and understood by computers as well as humans. This would allow machines to do tasks such as searching without a human operator. The semantic web does not change the web as we know it, it extends it, by adding new forms of data and tags, in the form of XML, to documents so that they become readable by both humans and computers.
The problem

Our society has been in a media frenzy for the past 10 years. New technologies have led to an overwhelming volume of content: 24 hours of video are uploaded to YouTube every minute alone. With this in mind, it can be a hard and lengthy process to find the exact media file you are looking for, be it audio, video or text, especially if you don't know the title or producer. For example, if you are trying to describe a film to a friend but have no idea of its title, it is obviously going to be very hard to share the media you wish to. The same problem is true of information in general. Wikipedia is considered by many to be the main index of information on the web; my application offers a similar service, but for media files.
The solution.
"A killer application that automatically tags current and future media content."

My application would use software to scan any media uploaded to the net via a unique API. This API could be included in any upload service on any site, such as YouTube. After the media file has been scanned, any dialogue would be converted into searchable text. The person who uploads the file can then tag it with unique tags, and any viewer can add further tags to the content, making searches quicker and more efficient. So instead of only tagging title, genre and so on, which is possible on most video sites now, people will tag things such as location, actor name or song pitch. My application is therefore not a database; it is a search engine.

Basically, the user inputs a selection of text, be it an excerpt from a song, film or book, and selects a media type to narrow down the search. So if the user has typed in a line from a song such as "then we kiss", clicking search makes the software scan the user-generated tags and the XML documents of the media for the related text. Based on this search it selects what it believes is the most relevant video, song or book, depending on which media category was selected on the first screen. In addition, it starts the media file at the exact point at which the text appears. For example, in the second screenshot, after "then we kiss" has been typed in, the video starts at 0.18. It knows to start here via the XML document, which has made the file searchable as text.

So here is an example of the XML for any given media file.
Produced by: Oliver Koletzki, Title: hypnotized


<?xml version="1.0" ?>
<post>
    <title>Produced By: Oliver Koletzki  Title: hypnotized</title>
    <author>Media Type: Music</author>
    <time>Time: 0.18</time>
    <message>
        I catch your eyes, try not to smile
        I track your style, I feel your vibes
        We have a drink then go outside
        Talk for a while and then we kiss
    </message>
</post>
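
Just to illustrate how a search could use a file like this, here is a rough Processing-style sketch. It is only my own illustration, not the prototype itself (that is built in Flash/ActionScript), and it assumes the XML above has been saved as "mediaindex.xml" in the sketch's data folder.

import processing.xml.*;

// Illustration only: load a media file's XML (like the example above), look for the
// search text in the <message> element and report the <time> tag so playback
// could start at the right point.
XMLElement post;

void setup() {
  post = new XMLElement(this, "mediaindex.xml");   // hypothetical file name
  String query = "then we kiss";

  String message = post.getChildren("message")[0].getContent();
  String time    = post.getChildren("time")[0].getContent();

  if (message.toLowerCase().indexOf(query.toLowerCase()) != -1) {
    println("Match found - start playback at " + time);   // e.g. "Time: 0.18"
  } else {
    println("No match for: " + query);
  }
}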


Starting to make Media Index

So as you can see I have mocked up some basic images of how I want the interface to look. I have kept the design minimal yet aesthetically pleasing.
The reason for this is inspiration from Google's incredible success in creating an easy-to-use service. To make the prototype functional I will be using Flash (ActionScript 3.0) and XML, so like many others I set off to gain a deeper understanding of both via tutorials.
While the tutorials I found helped, I quickly learnt that showing the full potential of my prototype was going to be hard. Pulling in video via XML was a very tricky process, so I kept it simple by having the video embedded in the Flash file. The only drawback of doing it this way was that I now only knew how to import a single media type. The original plan was to have an example of a book, a song and a film all being pulled in from XML, but the chosen file still fully demonstrates how my product works.






So as you can see from this screenshot, the design has changed slightly. Instead of a drop-down box I now have radio buttons labelled Title, Media type and Message. The user selects each one depending on how much information they have about the media file they are trying to find. For example, if they know the producer of a film and a small passage of dialogue from it, they can select those two buttons and search on either; all three can also be selected to search every tag contained in the XML. The result depends on which buttons were selected, and if nothing is typed, or the search request is less than 6 characters, the user sees an error message. I have done this so that my application gets the best search results.
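
(The prototype does this inside the Flash file in ActionScript 3.0; purely as an illustration of the rule, here is a rough Processing/Java-style sketch with hypothetical names.)

import java.util.ArrayList;

// Hypothetical sketch of the search rules described above (not the actual prototype code).
String[] tagsToSearch(String query, boolean title, boolean mediaType, boolean message) {
  // Reject empty searches and anything under 6 characters so results stay relevant.
  if (query == null || query.trim().length() < 6) {
    println("Please enter a search of at least 6 characters.");
    return new String[0];
  }
  // Only the tags whose radio buttons are selected get searched.
  ArrayList<String> tags = new ArrayList<String>();
  if (title)     tags.add("title");
  if (mediaType) tags.add("author");    // the media type lives in the <author> tag above
  if (message)   tags.add("message");
  return tags.toArray(new String[0]);
}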

The result box gives the user vital information about the file: title, media type and timeframe. The timeframe result is key to my application, and depends on what the user searched for. So, for example, if the words from a song occur 2 minutes and 20 seconds into the file, the video will start at that point. My application knows to do this because of the unique time tags given to each file automatically.
I also realised that any given search will more than likely give multiple results, especially if the dialogue is the same for two different media types, e.g. a movie adaptation of a book. With this in mind I have added a 'Nearest media types' section, a lot like YouTube's suggested videos, but mine contains all three media types from many different sources.

Why is Media Index Semantic?
Currently, computers can only search media by title and by certain tags that a human gives them. Tagging does make for an effective search, so it will remain in place with Media Index, but a bit more fine-tuned: as mentioned, new tags such as location, actor name or song pitch will be added automatically.
On top of this, my application relies on software that is not available yet: it will convert any media file into a searchable text document the moment it is uploaded to the web. So if a user uploads a video to YouTube, the software will scan the data and create an XML file searchable via my application.
However, nothing will be hosted on my site, because that would make it a database, and that is possible now. Instead my application is a search engine for media files, in the way that Wikipedia is an index for information.



What makes Media Index a Killer app?

To me, a killer application is something that dominates its market and makes money. My application will do both. When the user gets the desired result, e.g. a music file, links will appear showing the three cheapest prices for the product from three different sources. YouTube has incorporated something similar and makes revenue by providing a link for the user to purchase such media files. By sending vast amounts of traffic to many different sites, my application will generate sales for each of them and take a certain cut of the money from each sale.

Conclusion

Here is the final and finished prototype:       www.busy-signal.co.uk/mediaindex.html








Thursday 22 April 2010

Conclusion.




So our project works, and we are very proud of it. We set out to explore Stonehouse in a different and advanced way, and I think we've been successful. Everything we wished to include in our project, we have succeeded in including. While there are some things we could have built on further, such as a more complex animation, overall our group has done well to achieve the final product. I feel the technology we have used has been interesting and intriguing to work with, and very cost effective.

Personally, I have to admit that when I started this module I was confused and baffled as to how we would achieve what we have, or even what we were really trying to do in the Stonehouse area. Now, though, I see clarity in the use of different forms of technology, and my way of thinking has changed significantly when approaching creative assignments such as this, which I think is something this module tries to draw out of us students.

As a final thought, I do hope to take our project down to the Dance Academy to promote the building itself, as I hope something is done with a building that has such a rich history.







Monday 19 April 2010

Frustration, success and animation.


First things first. I wanted my tornado (ellipses) to start off at a certain point, so I declared 3 floats as mentioned in a previous post and gave them the value of 255, which is the location on the x/y scale where they would start.


float z = 255;
float x = 255;
float y = 255;


and the stroke was simply:


stroke(#d504ff);


I then realised that I should construct the animation itself, meaning deciding what I wanted to happen depending on the wind speed. I looked at the data and saw it came in a decimal format such as 3.12, so I was going to use the whole number for my animation, which meant using an IF statement.


int q = int(title[i].getContent().charAt(0));    // first character of the <title> element
int w = int(pubDate[i].getContent().charAt(1));  // second character of the <pubDate> element
int e = int(value[i].getContent().charAt(2));    // third character of the <description> element


I used 3 arbitrary letters so that I could use the value of q, which was being pulled in through the XML feed.
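
(Side note: int() of a character actually gives its character code rather than the digit itself, which is why the threshold below looks odd. If you wanted the actual numeric wind speed instead, one possible helper, not what my sketch does, and assuming the description text is just the reading such as "3.12", would be:)

// Hypothetical helper: turn the description text, e.g. "3.12", into the whole-number wind speed.
int wholeWindSpeed(String description) {
  float speed = float(description.trim());   // "3.12" -> 3.12
  return int(speed);                         // 3.12 -> 3
}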


So I started with...


if (q <= 210) {
  text("Windspeed is slow, have a nice day!", 80, 200);
}
else {
  text("Windspeed is high! Take it easy out there!", 50, 150);
}


Now came hours of research and tutorials..


fill(x,z,y); - This fills my ellipses with colour.
stroke(#d504ff); - This gives them the pink stroke outline.
Two very simple lines of code.
ellipse(abs(x+=9*cos(z+=.5-noise(i++)))%L, abs(y+=9*sin(z))%L, 40, 40);
However this mean-looking line required lengthy tutorials via www.learnprocessing.com and www.processing.org. (L and i here are variables defined elsewhere in the sketch; the modulo by L wraps the positions so the ellipses stay on screen.)




So this code moves the ellipses around the sketch. I've revised the functions used in this line of code and I do understand why they are used and what they are for, but I don't think I'd be able to write it off the top of my head. Still, it gives me the effect I wanted in my sketch. To get it to move in certain ways I have to change the values of x, y and z in my IF statement.


text ("Windspeed is slow, have a nice day! ", 80, 200);
z = z - 1;
x = x - 1;
y = y +1;
}
else {






text("Windpseed is high! Take it easy out there!", 50, 150);
x= x - 5;
z = z - 8;
y = y + 5;


}


The numeric values are arbitrary numbers I chose. This proved to be very temperamental, as the slightest change in a number can result in a completely different animation, since it changes how and where the ellipses fall. It also changes the speed at which they move around the screen. I simply used trial and error with these values until I found an effect I liked for both parts of my IF statement.


Lastly I thought I would explain this..
filter(18);  // mode given numerically; 18 corresponds to Processing's DILATE constant
filter(11);  // 11 corresponds to BLUR
The filter function is a very artistic function and gave me the wave/ripple/cloud effect when the ellipses move. It distorts the image, or in my case my ellipses, to create a tail of colour. The numeric value given to filter can also be very temperamental, but this is a good thing as it makes for very varied animations, and I think it matches my specification perfectly.
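
(As a rough illustration of why the trail appears, not my actual sketch: if the background is never redrawn, repeatedly filtering the accumulated canvas smears a moving ellipse into a tail.)

// Minimal illustration: blur the whole canvas every frame without clearing it,
// so the moving ellipse leaves a trail of colour behind it.
float angle = 0;

void setup() {
  size(400, 400);
  background(#77b2f5);
  noStroke();
}

void draw() {
  filter(BLUR);   // same idea as filter(11) above, using the named constant
  fill(#d504ff);
  float ex = width/2  + 100 * cos(angle);
  float ey = height/2 + 100 * sin(angle);
  ellipse(ex, ey, 40, 40);
  angle += 0.05;
}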


So, conclusion? I'm proud of my work. At the start I hated Processing and could not get to grips with it, but tutorials, as with most things, are a godsend! The process was frustrating, as I had to learn more before I could really start my sketch when I just wanted to get it done. There is obviously a lot I still don't understand, but I feel that through this assignment I've covered the basics and more. As a note, it's much easier to sign the applet in a Mac terminal window than in Windows CMD; this process was made simple thanks to Chris Saunders. As well as Chris, a big thanks goes to www.learnprocessing.org, as it has been a huge help!

My processing sketch is available on http://busy-signal.co.uk/processing.html


The setup!




In my setup I knew I wanted to declare a background colour and use a custom font.


Declaring a background colour is a simple task and requires one line of code. Using a custom font, however, is different.


void setup() {
  size(400,400);                              // size and background come first so the text isn't painted over
  smooth();
  background(#77b2f5);
  PFont font;
  font = loadFont("StormExtraBold-48.vlw");   // .vlw font created with Processing's Create Font tool
  textFont(font, 32);
  text("word", 15, 50);
  textSize(14);
  text("word", 15, 70);
}

Here is my setup. In this code I have imported a font that I installed via dafont.com, then declared its size/scale and where on my 400 x 400 sketch I want the text to appear! So at this point my sketch was importing data from the XML feed, and the setup of my sketch was complete. Now came the hard part: the animation!

Wednesday 14 April 2010

The sketch takes shape!


Before I started to design my sketch I penned down a few visual targets...



  1. Use the colour scheme of my website (www.busy-signal.co.uk) to great effect.


  2. Have a personal message for each wind speed condition, i.e. if the wind speed is high, display "High wind speed".


  3. Use a series of ellipses circling each other to create a wind effect over the sketch.

Numbers one and two were very simple; I learnt I could include colour references inside Processing, e.g. background(#77b2f5). Very simple!


However number 3 was the hard part. It required the use of tutorials, new functions and a cheeky bit of curve making.


Here is my starting setup ..
import processing.xml.*;

XMLElement xml;
XMLElement[] title;
XMLElement[] pubDate;
XMLElement[] value;




void setup() {
frameRate(10);
smooth();
size(300,300);
background(#77b2f5);


String url = "http://x2.i-dat.org/archos/archive.rss?source=bms_.WindSpeed";

xml = new XMLElement(this, url);  // assign the field declared above rather than re-declaring it locally
title = xml.getChildren("channel/item/title");
pubDate = xml.getChildren("channel/item/pubDate");
value = xml.getChildren("channel/item/description");

}

Now obviously one ellipse rotating is not going to look good or anything like a tornado, so I decided to use 3, which means declaring 3 float variables and assigning each ellipse a different ID.


float z = 255;
float x = 255;
float y = 255;


I knew I had to have an IF statement to give two or more different experiences of the sketch depending on the wind speed, but the hard part was trying to figure out how to make the circles rotate and move across the screen.
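
(Before the noise() version used in the final sketch, the basic idea can be sketched like this; this is just my own rough illustration of circles orbiting a drifting centre, not the final code.)

// Rough illustration: three ellipses orbiting a centre that drifts across the screen.
float angle = 0;    // rotation of the group
float cx = 0;       // centre of the "tornado", drifting to the right

void setup() {
  size(300, 300);
  smooth();
}

void draw() {
  background(#77b2f5);
  fill(#d504ff);
  for (int i = 0; i < 3; i++) {
    float a = angle + i * TWO_PI / 3;   // space the three ellipses evenly around the circle
    ellipse(cx + 30 * cos(a), height/2 + 30 * sin(a), 20, 20);
  }
  angle += 0.1;              // rotate
  cx = (cx + 1) % width;     // drift across the screen and wrap around
}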




Change of idea and XML!


After messing around with a few simple sketches I decided not to bother with "bms_.OutAirHum" from the RSS feed. I watched it over the course of 5 days and saw that it doesn't change much, so instead I'm going to use "bms_.WindSpeed", which does change considerably.


I found that importing this into Processing was no difficult task, using this code...


import processing.xml.*;


XMLElement xml;
XMLElement[] title;
XMLElement[] pubDate;
XMLElement[] value;

followed by this in the setup...


String url = "http://x2.i-dat.org/archos/archive.rss?source=bms_.WindSpeed";
xml = new XMLElement(this, url);  // assign the field declared above rather than re-declaring it locally
title = xml.getChildren("channel/item/title");
pubDate = xml.getChildren("channel/item/pubDate");
value = xml.getChildren("channel/item/description");

To see if the data was entering Processing I used the "println" function so that I could view the data in the output window. Great, that was all working; now for the hard part: what to do with this data?
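
(For reference, the check itself is just a loop over the parsed elements, placed at the end of setup; a minimal sketch of what I mean:)

// Print each item pulled from the feed so it can be checked in the output window.
for (int i = 0; i < value.length; i++) {
  println(title[i].getContent() + " | " + pubDate[i].getContent() + " | " + value[i].getContent());
}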


So I wanted to make my visualisation a bit more advanced, something cool to look at. However, my Processing knowledge DID lack. This site, http://www.learningprocessing.com/, has helped me and I'm sure many others with this assignment.


The idea in my head was simple: create a tornado in Processing! The hard way of doing this would obviously be a mass particle swarm turning and changing speed depending on the wind speed.

The easy way? A series of ellipses turning close together, which also makes it look like the fan on an A/C unit.

Monday 12 April 2010

Spit and polish!


Well, it's finally done!
www.busy-signal.co.uk

As you can see, the Twitter API widget now works. It's frustrating that I spent a good 2 hours scouring my code to find out why it wasn't working, when all it needed was the crossdomain.xml file placed in the root directory and the HTML file in the same folder as the .swf and .php files. (Thanks to Luke Mears for this.)

The final design changes have also taken place, merely by cleaning up the code in all my HTML files and changing the design slightly, but it's now in a state with which I'm completely satisfied!

Thirdly, my contact form now launches whatever mail client the user has installed, using this bit of code: action="mailto:benquinney1@googlemail.com". Ideally I want a CGI script so that the form sends straight to my email; I'm still in the process of learning how to do this, but it should be up and running.

Lastly, jQuery has proven a struggle, so I'm teaching myself it in order to make a nice slideshow for my portfolio. Until then I'm working on a simple CSS/HTML-based scrollable slideshow, which will be online very soon.

Sunday 11 April 2010

Updates and security issues!


So the saying goes, "you're your own worst critic". Well, it's proved true, because I've changed the design again.
See here...



  1. The blog feed is now encapsulated in a rounded rectangle, all thanks to new CSS techniques:
    "-moz-border-radius: 10px; -webkit-border-radius: 10px;"

  2. I've got rid of all the unnecessary brushes I had in the background, creating a cleaner, more simplistic layout in doing so.

  3. The header has changed slightly to match the layout and give it a more designed feel.

I'm very proud of how it's turned out; however, Flash is disagreeing with my plans.

I've created my own Twitter widget, see here...
The three dark blue boxes are supposed to output my three latest tweets, and in Flash it has worked perfectly.
However, due to security restrictions in Flash, I have to make my widget read Twitter via a PHP file. This is where the problem starts. Every browser throws out this error:
"Error #2044: Unhandled ioError:. text=Error #2032: Stream Error. URL: http://busy-signal.co.uk/twitter.php
    at Twitter_fla::MainTimeline/Twitter_fla::frame1() "
Apparently it's due to my crossdomain.xml file not being in the root directory, but I've fixed that and I'm still getting errors. For the moment I'm going to leave it alone, as I fear I might throw my laptop in the bin if I proceed!


Lastly, I'm in the process of working on a jQuery slideshow for my portfolio, but jQuery is refusing to cooperate!









Friday 9 April 2010

Update! 09/04/10.

So it's been a hard few days with all the assignments going on. However, I'd had enough of looking at my website and not seeing my blog on it. Obviously I needed to convert my Blogger blog to WordPress, but my web host, Heart Internet, has made this a harder task than it needs to be. It does support WordPress and MySQL databases but makes the process of getting them on there a nightmare. So I decided to use FeedBurner to display the 15 most recent posts on my website, and I also changed the Blogger layout to match that of my website in case the reader decides to explore my blog further.

Secondly, my Facebook and Twitter have been syndicated, allowing me to post across two social networking platforms while only being logged into one, making for a more fluid, singular online presence.

Lastly, my website has undergone its final design changes, to a standard I'm happy with, and it now has a favicon in the address bar!

A very successful day....... in some regions!

/end success
}

I SUCK!

After looking at many examples of Processing, it would seem that to do anything worth looking at you need some UBER maths skills, which I do not have.

Most of my ideas have failed to make the transition from my head into Processing, so it's time to keep it simple.

I intend to have a single ellipse moving around the stage, creating a hazy smoke effect in its path. The clearer the view, the less humid it is outside, and vice versa. Let's crack on!

Time to Visualize!


So it's time to crack on with the Processing visualisation.


Using the ARCH-OS RSS feed I need to visualise the data in some sort of aesthetically interesting way! Simple! Well, not really, for me, but still I've got a lot of ideas.


I've decided to pick "bms_.OutAirHum", which I gather is the outside air humidity.


The reason I picked this data source is that anything like rain, heat or wind speed is quite easy to visualise via the effect it has on the environment: the human senses obviously know if it's hot or cold, or if there has been a lot or a little rain. Humidity, however, is less obvious, so I chose to visualise it in a way in which we can easily see how humid it really is.

 Ideas?!

  1. Clouds or fog?

  2. A ripple effect creating some sort of mist

  3. A wave effect with a colour range?

Well, it's a work in progress...



Wednesday 31 March 2010

Testing, Testing!



Everything has finally come together and our ideas have been realised with new technology. Well, HA-ZAAA, it works! Have a look at the video below of our project being tested on campus. As you can see, our project works perfectly, and it will work in the same way in Stonehouse, although the light source will be the traffic light.
So you can see the light trigger the hacked fart machine, the Arduino picking up this sound and outputting a 1 or a 0, and Flash picking up when a 1 is output and running our animation displaying the RSS feed from the internet. So enjoy our test video!


A job well done!


So then, the time has come to finalise and test our project. To review...
Input: traffic light - light sensor.
Process: fart machine - sound sensor - Arduino board.
Output: computer - RSS - projector.
So, as mentioned, we are going to use the RSS feed to display information in Stonehouse, taking the latest posts from the RSS feed on the Herald website, for reasons mentioned in previous posts.


Ideas:
    Flash animation






  1. Moving text






  2. Subtle message






  3. Opening curtains
Based on the time the public spend waiting at the red traffic light, we know that the animation has to be relatively short yet send a message that's memorable. If we just have standard static text, the public won't be drawn to our project. We do know that the RSS feed is going to be imported into our Flash animation live, so any new post will appear in our animation, displaying several bits of information very quickly, in a sort of subliminal fashion. Controversial?
The idea of having red theatre curtains start and end the animation was also suggested. Not only will this be a moving animation that holds the user's attention, it also nods to the theatrical, historic side of the Dance Academy, something we had dismissed previously, which is nice because now we are exploring the realms of both modern and old history within Stonehouse. By applying this, I think our target audience now covers all ages and people from every demographic; after all, voyeurism is embedded in human nature.


So Luke used his Flash skills and put together an animation with all the specification in mind.
As you can see in the image below, the RSS has been successfully imported and is ready to use in our animation. However, we are also going to add our own message to the animation to make it A. send a controversial message and B. extend people's thoughts whilst they explore the space of Stonehouse.
Thanks to http://pixel13.lukemears.com/ for the image!














Group photo


The back of the project has been broken: our input has been decided, the communication side of things has been worked out and work has started on Processing. So now we were using code of a different format. Us IDAT students have been used to code of a certain variety; this Arduino/Processing lark was very different. It's designed for artists, which we are, I suppose.


However, it seemed to go pretty swimmingly and we got on with the code straight away. What we struggled with from the outset was how to get Flash to pick up whether the traffic light is red or not. The Arduino outputs a 1 or a 0 depending on the sound/light sensor's state, so we obviously need to get Flash to register that somehow! We also need to import the RSS feed for use in our animation.
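
(Just to sketch the first half of that chain, reading the Arduino's 1/0 on the computer, here is a minimal Processing example using the standard serial library. The port index is a placeholder, and this is only an illustration, not our actual bridge into Flash.)

import processing.serial.*;

Serial port;

void setup() {
  // Placeholder port choice; pick the right entry from Serial.list() for the Arduino.
  port = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  while (port.available() > 0) {
    char c = port.readChar();
    if (c == '1') {
      println("Sound detected - the traffic light is red, start the animation");
    } else if (c == '0') {
      println("No sound - light not red");
    }
  }
}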


The setup:




The sketch:
Next it was time to start work on the actual projection!


The hard part for some.


So whilst we were in New York City with the IDAT crew, Jon was busy at home hacking away. (My heart goes out to you.)


We now had a way to detect the traffic light turning red via Arduino. However, a problem had reared its ugly head.


The Problem


Because we are now using Arduino, we had to find a way to communicate with the projector.
Option A: have a wired connection. Option B: have a wireless connection.


Option A is clearly going to require wires leading over the road; to be honest it looks untidy, and it makes it very easy for any member of the public to see how it's working. What a bore!

Option B presents the challenge of getting the light sensor on the Arduino board to communicate wirelessly with the projector!


Well, we do like a challenge, us IDAT lot.


Using Arduino parts it's easy enough to communicate wirelessly; however, the parts reach up to £50 in price and, well, we are students! So we found a method that was cheap and still provided wireless technology.




A fart machine!?
Yes, a fart machine. Via a very simple form of toy hacking we can use a £10 fart machine in our project (for more examples of toy hacking visit tinker.it). Jon did a sterling job of hacking a simple toy fart machine so that the button that activates the sound was replaced with a light sensor. It still worked in the same way: the wireless connection to the remote was unchanged, but now a light sensor triggered the sound instead of a button. I think we may have discovered cheap wireless! We would then use the sound made to set off the projection. Basically: traffic light > light sensor > fart machine > sound sensor > computer > projector.


 
 And here's a video to show it working....




So, a little change: the fart machine's newly hacked remote would sit inside the traffic light, so that when the light comes on it sets the fart machine off wirelessly; the Arduino board then senses the sound and, when it does, communicates with our Flash animation.



Amazing Arduino!


Wow, so I literally spent a good hour, maybe more, on YouTube looking at examples of Arduino uses, and it's AMAZING!
I was so impressed with the potential and power of something so small, cheap and relatively simple. Most uses, of course, are for sheer fun and art, but the ideas behind them come from intelligent minds!


Take this video as an example.







I love this piece! Taking something like speech, visualising it in a modern aesthetic and displaying it on a table! Wow! I'm really keen on this and hope to gain much further insight. For now, we as a group are keeping it simple with light sensors, using an LDR (Light Dependent Resistor).


The amazing world of Arduino.


So our ideas have all fallen into place. We're happy with what we want to achieve and are ready to get down to business.
Until recently none of us really had a strong idea of how we were going to tackle what now seems an easy task, but that's thanks to Lee Nutbean.
Our first challenge was to work out how we would pick up the traffic light's transition to red. At this point we were introduced to Arduino.


Arduino  
"Arduino is an open-source electronics prototyping platform based on flexible, easy-to-use hardware and software. It's intended for artists, designers, hobbyists, and anyone interested in creating interactive objects or environments."












Over the course of the last few weeks we've been toying about with Arduino, Processing and general, errrm, stuff. I had never seen this technology before on a hands-on basis and was really keen to get to grips with it. What makes it so fascinating is that all code designed for Arduino is open source! Ideas being shared on a global scale, lovely! So a lot of inspiration was available quickly and easily. As you can see from this video, some really cool things can be done...








How do they work?




Put simply, Arduino boards do what they are told. Using the Processing/Arduino software, which is freely available on the web, users can program a set of instructions and upload them to the boards. The code can be set to run automatically, but what makes these boards such a cool subject is that light, sound and pretty much any other sensor can be applied to them. So now we knew what we would be using to detect the traffic light turning red!


Now we have Arduino technology being used in our project, which is not only useful but also cost effective. Next we had to think of a way to produce cheap wireless communication between the light sensor/Arduino board and the laptop/computer we are going to use to show our Flash projection on the Dance Academy.


Monday 22 March 2010

Twitter.

Twitter is obviously a social craze, one which all of us IDAT'ers have jumped on board with.
It's easy enough to put a simple Twitter widget into your website, but I'm not happy with that.

I've decided that I'm going to use my uber Flash skillz (which aren't so uber) to make my own widget.

Not an amazing idea, but it's going that extra step.

Blood, Sweat and Photoshop.

I wasn't happy at all with the pure beginner standard of my website. Not impressive at all. So once again I set out to redesign the whole site with a more Web 2.0 aesthetic, while keeping the same colour scheme.

So here it is... Ta daaaaa!

The sidebar has been greatly improved visually.
The website as a whole offers a more pleasing user experience now.
I think headers are too bulky, so I got rid of mine and designed a new logo to replace the bulky mess.

I'm still working on roll-overs, and I have a problem where, when I zoom out, the background moves away from the content, due to the background being a JPEG. Probably a simple solution, one which I'll find :).

Sunday 21 March 2010

How we got this far


Our plan is a happy plan, meaning we are happy with it.
We've gone from idea to idea and finally settled on what we want to do, but the main aim has always been to send a message about how controversial buildings such as ours have been in the space of Stonehouse.
Over the past few lectures we've also been discussing which technologies we are going to incorporate into our project, meaning how we are actually going to make our design work.
So, a brief run-down...


How are we going to use data?


RSS ("Really Simple Syndication") is a family of web feed formats used to publish frequently updated works—such as blog entries, news headlines, audio, and video—in a standardized format. (Wikipedia)




So here's a small layout of our plan....

  1. Traffic light turns red.

  2. Light sensor picks up this action.

  3. It wirelessly communicates with Processing/Arduino.

  4. The RSS feed is displayed until the red light changes.

  5. The RSS feed will be incorporated into a Flash animation.

  6. The Flash animation ends when the traffic light turns from red to orange.


RSS in general is, for me, a piece of coding genius. Such a simple idea, and it integrates into browsers so well, but what makes RSS so appealing is the constant updating of any given feed. Take BBC News: each headline is posted into the feed for any reader to see quickly and easily. As luck would have it, the Dance Academy building has its own feed via the Herald, here: http://www.thisisplymouth.co.uk/danceacademy.
It's not updated daily, perhaps not even weekly, but if it were it would provide a different experience for the public.
So we plan to project this feed in a way that gets the public who see it talking about it afterwards. Perhaps simplicity could be the way forward with this?
Now we know what information we want displayed and how we want it to work. The next step is to figure out how we will do it.


Back to business!


Our project is now developing and evolving into something I think we as a group are all going to be proud of. Compared to previous posts, our concept for the project has changed significantly.


We moved on from the mise en abyme effect because, while it did look cool, we weren't sure how to incorporate it into our project with significant meaning. We also decided that we were going to use skills we actually had, as opposed to ones we didn't, HA! So we've thought about having a Flash-based projection involved in our work.


Next, ideas around the history of the building also began to fade due to a lack of ideas. It's also a subject that we think many Plymouth people may know about, while not being so up to date on recent news involving the Academy's closure, etc.


Therefore we focused on the more recent state of the building as a nightclub, "the Dance Academy". Drugs, dancing and crime are three topics that spring to mind when the Dance Academy is mentioned, so our visualisation via the projector was going to incorporate this: very controversial, yet it may send a message to the public and hopefully the council, which is kind of what we're aiming for.


Instead of having a boring and perhaps tacky-looking questionnaire on a website built by us, we've decided to use RSS feeds. We intend to take the data from an RSS feed that relates to Union Street, or better yet the Dance Academy, and project it onto the chosen location, which is now point B on the map in a previous post. This location has been chosen because we want our main audience (the people driving through) to see it, so we are going to have our entire project somehow activated by the traffic lights at the pedestrian crossing right in front of the Dance Academy.


We hope for the effect that people in cars create a mini traffic jam because they are focusing on our projection in Stonehouse, thereby publicising our work via word of mouth instead of us trying to promote a website with a questionnaire collecting data they probably don't care about. This way it happens automatically, without anyone's input.


We think this is a much better approach to A. incorporating contemporary technology formats, and B. making people stop and stare, and since that's what we are trying to achieve it makes sense.


So then....
OUR IDEA IN A BUBBLE !
  1. Traffic light at crossing turns red.

  2. RSS feed is displayed.




Sounds simple? Ahhh, well, no. Many technologies have to be incorporated to make this work, and we must choose which ones will work best, whilst bearing in mind cost and practicality. Hopefully, though, it's going to look something like this...




(Courtesy of www.pixel13.lukemears.com)




Saturday 13 March 2010

Design update


So I went ahead and changed the website logo to match the Twitter layout. It looks much better and more coherent. Also, my header no longer uses the repeat-x property in the CSS file; instead it is now a single .PNG file. Much better.






The design falls into place




Check the new design, yo. Nothing amazing or even that special, but I've got a colour scheme that I like and think works. The header will need more work as it doesn't exactly fit; in fact, after taking this very screenshot it's easy to see that the logo needs an entire rework, but it's still a vast improvement on how Busy Signal 1.0 looked.

As for the code itself, using Twitter's website I've incorporated my latest tweets into the sidebar of all my pages.
The script....
<script>
// Relies on Twitter's widget.js library already being included on the page (it defines TWTR.Widget).
new TWTR.Widget({
  version: 2,
  type: 'profile',
  rpp: 6,
  interval: 6000,
  width: 250,
  height: 300,
  theme: {
    shell: {
      background: '#1590ce',
      color: '#fafafa'
    },
    tweets: {
      background: '#151a4d',
      color: '#f704ff',
      links: '#1590ce'
    }
  },
  features: {
    scrollbar: true,
    loop: false,
    live: false,
    hashtags: true,
    timestamp: true,
    avatars: true,
    behavior: 'all'
  }
}).render().setUser('busysuk').start();
</script>


I'm in the process of converting my blog to WordPress, so that should be on the website very soon. Stay tuned...





Saturday 27 February 2010

Around the world with data part 3.


The amazing world of GPS (Global Positioning System):




Obviously I knew what GPS was, but I wasn't sure how it related to anything we were doing. I started thinking about how you could use GPS for all sorts of interesting projects and data visualisations, but first I had a deeper look at GPS itself.


How does GPS work?


Where is it used?
Well, from the map you could gather that it's just used as part of in-flight technology. At one point in time I gather that was true; however, in modern society it is used to navigate roads, or even to walk from one point to another using smartphones.
GPS seems to be working its way into our lives bit by bit. It's even possible to know the location of anyone in the world using phone-to-phone communication, but it's something that isn't widely used due to its voyeuristic nature.


However, when GPS is used to explore space it can tell an interesting tale. Referring to my previous post, it was easy to see the small routes humans had taken in their own living room, and a viewer could look at it and say "ahh, at this point mum went out of the room and maybe made a cup of tea for her child". It's like the famous saying: a picture says a thousand words.
Google Maps is by now famous, as are Earth and Street View. It is Google Street View that captivates me the most. Obviously it is not live GPS tracking, but at some point it would have been, and now that route is available for everyone in the world to see. So if we take Union Street, specifically our group's project area the Dance Academy, imagine if we could use Street View with some sort of augmented reality to send a message to the public about its history. It's an idea which we got excited about, but we knew pretty much from the outset it was not going to happen, due to software restrictions, time and the overall publicity of the project.
It is clear, though, that GPS is the most technologically advanced way of exploring space, and I became very interested in how GPS is used as a space exploration technique and found a very interesting example. I was amazed at the scale of flights that take place daily; without this visualisation it would be hard to grasp that many planes are in the air at once just from numeric data.





However, some people take a very different approach to what is now a day-to-day technology. Take the example below: a person has noted positions with GPS and created a sort of art in a space that you would not expect. Such a simple yet interesting concept; I think this art can be taken a lot further!




Around the world with data part 2.


Data representation:


Data of any form can be visualised in so many different ways. A lot of the time it is made into simple graphs so that it is easy to understand. However, it doesn't take a genius to realise that these graphs and standard methods can be boring and aesthetically crap! So I decided to see what I could find on the internet.

WOW! Now that's what I want to see: infographics that represent data in a stylish way. The topmost map represents connections between different subjects, and I like how easy it is to see where the dense areas of information are. The colour coding is obviously key to this map.


The second one is key to this module as it is clearly exploring space. I don't know exactly how it was mapped; it could simply have been the creator noting down the positions of his subjects, or, even better, some sort of GPS. The title of the graphic says "Maps that tell tales", which is very true of this map. Using something as simple as GPS tracking can tell a story in itself, something I will explore in my next post. It's an amazing way to explore such a small space, and I plan to use this style of visualisation in future projects!



The last map is also very cool, although not as explanatory. Yet this is what makes for an interesting visualisation: it is all about the interpretation of the graph, and that's what I like. Do we as humans need to have words displayed to us to understand something's meaning? I think not. In the contemporary digital art world I think it's clear that many individuals and artists alike are trying to create meaning within pictures that are open to interpretation, something that has been in the arts for millennia.
Our group is exploring Stonehouse with collected data, data that could be used in visualisations such as these. We did consider taking them and creating something similar to project onto the Dance Academy, but the question was how we would make it live; and since the data won't have changed since we collected it, it wouldn't make for an interesting animation. Back to the drawing board with the projection part of our project, I guess.


                                                                                      

Around the world with data.


Over the past few lectures we've been gaining a deeper understanding of how data is, and can be, represented in a physical form, specifically focusing on technology data such as Wi-Fi. So I thought I would do a nice big post on what I found interesting.

Augmented reality (AR): "It's not there, but it's there."
Augmented reality is a technology we are going to see a lot more of in the coming years. It is blurring the line between what's real and what is computer generated. Video games in particular are set to firmly adopt this technology as we expect more immersive gaming.
However, AR is here now, in a primitive version, on most smartphones in the form of applications.
It's used for location-specific data: to tell users where they are via a camera feed, and not only that, to let them know where, say, the nearest restaurant is and how many miles away.
In relation to our own project, it would be great if we could have the news that has surrounded the Dance Academy over the years visible when the building is filmed on a smartphone, but our idea is going to have to avoid this.


When I first saw this video, I thought that AR was a relatively new way of thinking about and exploring space.


Obviously the WTC is no longer there, but using compass and GPS technology it creates a field of vision showing how the Trade Center would look in perspective from where the user of the phone is standing. This example is quite eerie but does show how the technology can be used.

Upon deeper exploration I found that AR is in fact used in sports coverage all the time, for object tracking such as a moving football. With that in mind I quickly realised that this technology is already in use everywhere, and in fact the term was coined as early as 1990.

It would be amazing to use something like this in our project; however, the time and skills needed for it are ones we just don't have. As I progress through my years at university I hope to use it in future projects, as I had many ideas that included AR for this first project of exploring Stonehouse.


The following video was very interesting. Simply, it is two men playing squash, but with computer-generated content on top of the actual reality: augmented reality.
After viewing this video I think the direction this technology is heading in is advertising, e.g. when people are using a more advanced version of Google's Street View service they will see posters for products on street corners or on buildings that aren't really there. It's a good idea, as businesses can start to capitalise on the vast number of people using navigation technology.

Busy-signal 2.0


Now that I've been a student of IDAT for almost a year, I can see how my website needs a considerable upgrade in both aesthetics and technology.

I've almost finished a brand new design for my website, which I think will do justice to my creativity, although coding it is proving a bit tough :\. It incorporates my online presence into one sleek, easy-to-navigate site.

My old site included:

  • XHTML

  • CSS

  • Javascript
My new site will add to these simple coding elements by using:

  • Facebook and Twitter API code

  • PHP tests

  • jQuery

Using these new elements my site will show my latest tweets and blog posts, and a transitional image slideshow will show my portfolio using jQuery.

Tuesday 23 February 2010

Hertzian space, does it hurt?


What is it?






Hertzian space is something that exists throughout society, yet is invisible to our own eyes.
To be more specific, it is the collection of waves and frequencies, e.g. Bluetooth, radio and Wi-Fi, that surround our very existence: millions upon millions of packets of data being sent from building to building and city to city, through our own bodies, and it's something we rely upon.


The safety of things such as Bluetooth can be questioned, but many remain carefree. However, technologies with huge ranges that are capable of passing through buildings and the human body do have their doubters.


That said, it is a very interesting subject and can be visually represented for anyone to see.




How can we see it?




Using our own project site (Union Street) as a starting point, we ventured along the road using a phone with Wi-Fi software such as Kismet to create a data map showing signal strength and weakness. The map below also shows the security type of each network. From the map it was easy to see the data that makes up Hertzian space. Note that this covers only one medium (Wi-Fi); if mediums such as Bluetooth and radio were included, I am sure the map would be full of colours.


Key:
- Red circle = Weaker signal
- Green circle = Stronger signal
- Padlock = secure network







Monday 1 February 2010

Stage 3: Conclusion.

Making my game has been a challenge; I've realised that Flash can be a very complex program. However, along the way I have learnt so much, including how to use packages, public and private, how to develop a more organised game by using separate ActionScript files, and how to use an external data source.



Where could I improve?



  1. An introduction screen with a play button would have been good, but with the use of packages and a single frame it became confusing to code.
  2. A retry button on game over, but for the same reason as above I failed to include it.
Overall, though, I am very impressed with how my game has turned out. It met the criteria set out for the task and I met my design brief. Now, having spent a serious amount of time on Flash, I think I'd like to develop my understanding further for future projects.


Thanks to:
Chris Saunders
http://www.good-tutorials.com/tutorials/flash
http://www.pixel2life.com