Art from code - Generator.x
Generator.x is a conference and exhibition examining the current role of software and generative strategies in art and design.
 
Tag: code
 

This page shows all the posts tagged with code. You can also browse all tags together.

Nodebox projects by Tom de Smedt

Tom de Smedt: Supercurly / Photobjects (done with NodeBox)

NodeBox was blogged here last year, but a check-in on the project reveals a number of developments that warrant an update. To refresh your memory, NodeBox is a code tool for visuals based on the Python language. It is being developed by Lucas Nijs, Frederik De Bleser and Tom De Smedt, all teachers at St-Lucas Art College, Antwerp.

Taking inspiration from Processing, NodeBox lets the user get to work coding graphics using a simplified syntax, without worrying about the underlying technology. Unlike Processing, NodeBox is based on vector graphics rather than pixels. That means it is an excellent tool for exploring 2D graphics intended for print, and in particular typographic experiments. The exported results take the form of PDF files, ready for use in Adobe Illustrator or any professional vector graphics package. NodeBox can also export QuickTime movies for animations.
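To give a flavor of the syntax, here is a minimal sketch in the spirit of a NodeBox script. The composition is an invented example; commands like size(), fill() and oval() are globals supplied by the NodeBox application rather than imports.

    # A minimal NodeBox-style sketch (Python). Run inside the NodeBox
    # application, which provides size(), fill(), oval(), random(),
    # WIDTH and HEIGHT as globals.
    size(500, 500)

    for i in range(40):
        # RGBA components in the 0-1 range; alpha keeps the shapes translucent
        fill(random(), 0.3, 0.6, 0.5)
        d = random(20, 120)  # oval diameter
        oval(random(WIDTH) - d / 2, random(HEIGHT) - d / 2, d, d)

Exporting the result as a PDF (or as a QuickTime movie, for animations) is then done from the application itself, which is what makes the output print-ready.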

The NodeBox Gallery shows off some good-looking sketches. Tom de Smedt has published two good examples: Supercurly uses the modular font Superveloz by Andreu Balius to construct organic compositions, while Photobjects is a database of images that can be queried by keyword; the matching images are then used to create randomized collages.

NodeBox is now up to version 1.0 release candidate 7, and is sophisticated enough to count as a real production tool. Sadly it is only available for Mac OS, but the source is released under the MIT license in case anyone wants to have a go at porting it. NodeBox is based on DrawBot by Just van Rossum.

Related links:

  • research.nodebox.net: The NodeBox research wiki.
  • Trapdoor: More projects by the NodeBox team and associates.
  • Replica: Even more projects and texts by the NodeBox team and associates.
 

HTML and markup languages like XML describe documents as hierarchies of tags, in what is called a Document Object Model. This structure can be visualized as a graph.

Websites as Graphs (by Sala of Onethousandpaintings.com) takes a web page URL as input, and outputs a graph of the underlying HTML structure. Used on any large content site like CNN or BoingBoing, it reveals the underlying logic of presentation used to build those pages. Related information forms clusters, with color codes revealing a tendency towards table- or CSS-based design (the former being a no-no, obviously) as well as the density of images, links etc.

While the graphs make for interesting images, it is still hard to draw any hard and fast conclusions about the page in question just by looking at the graph. But a well-structured document will always reveal itself as such, as will a badly-structured one. Websites as Graphs should be of interest to anyone who has tried to define a page structure, particularly if that structure conforms to the current CSS-based ideal of “logic, not presentation” in web design.

The source code for Websites as Graphs is freely available for download. It was built with Processing, using the Traer.Physics and HTMLParser libraries.
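For a rough sense of how such a graph is derived, here is a sketch of the principle in plain Python; the applet itself is built on Processing and HTMLParser as noted above, so this is only a language-neutral illustration. Every tag becomes a node, and every parent-child relationship becomes an edge:

    # Rough sketch: reduce an HTML document to the node/edge list that
    # a tool like Websites as Graphs lays out visually.
    # Standard library only; the tag handling is deliberately naive.
    from html.parser import HTMLParser

    class GraphBuilder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.stack = []   # currently open tags, innermost last
            self.edges = []   # (parent, child) pairs
            self.count = 0    # running id, keeps repeated tags distinct

        def handle_starttag(self, tag, attrs):
            node = "%s-%d" % (tag, self.count)
            self.count += 1
            if self.stack:
                self.edges.append((self.stack[-1], node))  # parent -> child
            self.stack.append(node)

        def handle_endtag(self, tag):
            if self.stack:
                self.stack.pop()

    builder = GraphBuilder()
    builder.feed("<html><body><p>Hi <a href='#'>there</a></p></body></html>")
    print(builder.edges)
    # [('html-0', 'body-1'), ('body-1', 'p-2'), ('p-2', 'a-3')]

Feed an edge list like this to a force-directed layout and color the nodes by tag type, and you have the essence of what the applet does.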

Update: Markavian has hacked up a remix version which allows you to browse the tag structure interactively and even follow links to new documents. To use it, point your browser to a URL in the following format:

http://mkv25.net/applets/mkv_htmlgraph/getDataFromURL.php?url=http://www.mysite.com/

“mysite.com” should obviously be replaced with whatever URL it is you want to explore.

 

The EXTEND workshop with Casey Reas, Ben Fry, Zach Lieberman and yours truly is now underway. Today is the second day; yesterday was spent giving personal introductions and dividing the 18 participants into groups. Each day we have micro-lectures. Zach started off by talking about animation and movement, and showed some examples from his “making things move” workshop.

The participants have shown significant interest in data visualization, and so Ben presented some background to computational information design. He used his Linkology project as a specific example.

Casey is currently speaking about the history of Processing (traced back to ACU and other MIT projects) and how to sketch with code. He is also talking about the importance of libraries as a way of extending Processing, in particular as a way of bringing it beyond the screen. As an example, he is demonstrating the new PDF library with some code examples that will soon be posted to the Processing site.

I will sporadically be blogging the workshop over on Code & Form, a new blog I just opened to support workshops, teaching and code experiments.

 
 

Due to the current concert tour (which is going very well, expect an update very soon) blogging has been a low priority. Here are a few interesting things we’ve noticed recently:

  • Atelier Nord has a call for participation for a workshop called The Empire’s New Clothes - Art, Fashion and Technology. The deadline is today – Monday 24 April – so if fashion is your thing, hurry up and send them a CV and statement of intent. Apologies for the late posting of this call.
  • Switchboard is a new Processing library written by Jeffrey Crouse. It implements a general application layer for using web services with Processing. Services already implemented to varying degrees are “google, yahoo, msn, allmusic, shoutcast, foaf, and rss/atom feeds”.
  • Linkology by Ben Fry is a project for New York Magazine showing link connections between the top 50 blogs. I’ve been meaning to blog it forever, but never got around to it, so I’m simply linking it here.
  • Visualcomplexity keeps adding new projects. Some new favorites are Essence of Rabbit (by our Berlin friends at Pictoplasma) and Font 004 - Community by Marian Bantjes. Interesting to see that Visualcomplexity is including projects that don’t fit a strict infoviz focus. If you haven’t checked in for a while then take a look and consider subscribing to their RSS feed. It’s well worth it.

Photos and video of the Generator.x tour should go online in the next few days.

 

The ever-trusty del.icio.us/TomC feed brings news of a debate related to the Processing or Die thread a while back. A blog post over on Grand Text Auto about a lecture by C.E.B. Reas at the Human Systems | Digital Bodies conference has drawn some interesting comments about “procedural literacy” and discussion of general terminology.

Michael Mateas, associate professor at the Georgia Institute of Technology, has posted a link to his paper "Procedural Literacy: Educating the New Media Practitioner" (PDF). In it he argues that a knowledge of computational processes (i.e. procedural literacy) is a requirement for anyone seriously intending to deal with the so-called “new media”. It’s slightly on the techy side of things, but has some interesting historical references (Papert, Kay, Nelson etc.) as well as some fresh takes on the basic problem of computing for the humanities. For instance, he proposes (writing) games as the perfect vehicle for understanding a procedural approach. Interestingly, another participant in the discussion, Ian Bogost, has a book out on MIT Press entitled Unit Operations: An Approach to Videogame Criticism.

The idea of computational literacy extends beyond what is traditionally considered code. Our favorite Norwegian blogger heroine, Jill Walker, forced her electronic literature students to learn HTML and CSS in order to set up their own blogs. While HTML lacks any active computational component, learning it can still be a transformative experience in terms of understanding how computers “think”. Just ask all the Myspace kids.

And of course there is always the dogmatic Open Source view as to why you should learn to code: If you can’t hack it, it will control your life.

 

An interesting link just came down Tom Carden's del.icio.us feed, by way of mflux posting it on Processing.org:

The Art in Computer Programming is an article by Andrew Hunt and David Thomas, both veteran programmers with views on how programming practices can be improved. At the core of the article is the assertion that programming can be seen as an art form, and that approaches from painting etc. can be gainfully used to improve the process of coding.

Comparing programming to art is not new. Donald Knuth’s monolithic series The Art of Computer Programming establishes the connection quite firmly, even if he uses art as a measure of quality rather than as a description of an aesthetic / critical practice. Paul Graham also seizes on the analogy to painters in his book Hackers & Painters.

Apart from some slightly distasteful analogies involving military scenarios of “hitting your target”, Hunt and Thomas have some interesting points that will be recognizable to experienced coders and newbies alike. The challenge of the blank canvas and writer’s block is familiar, as is the issue of when to stop. On these points the article gives clear and useful suggestions. The issue of “Satisfying the Sponsor” is all-important to software engineers and designers, but perhaps less critical to artists.

For another interesting take on how to program, read this quote from an interview with Bram Cohen in Wired 13.01. Cohen is the genius behind the notorious yet much admired BitTorrent filesharing protocol:

“Bram will just pace around the house all day long, back and forth, in and out of the kitchen. Then he’ll suddenly go to his computer and the code just comes pouring out. And you can see by the lines on the screen that it’s clean,” Jenna says. “It’s clean code.” She pats her husband affectionately on the head: “My sweet little autistic nerd boy.” (Cohen in fact has Asperger’s syndrome, a condition on the mild end of the autism spectrum that gives him almost superhuman powers of concentration but can make it difficult for him to relate to other people.)

Final quote: “[premature] optimization is the root of all evil.” The author of this famous quote is the aforementioned Donald Knuth. It was mentioned in a post over on Vogon Poetry (again found through Tom C.). The post summarizes a talk by Cal Henderson on the building of Flickr: interesting reading, as it describes how to create a scalable web application almost exclusively from Open Source software.

 

With its rich content and well-implemented tagging system, del.icio.us provides a tantalizing data set for would-be information visualizers. Fortunately, the open del.icio.us API allows developers full access to the functionality of the system.

To support the recently launched Processinghacks site I have written up a quick tutorial on how to access del.icio.us with Processing. The hack uses David Czarnecki’s delicious-java library. I also added a simple hack for outputting PostScript vector files.
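The tutorial covers the Processing/delicious-java route in detail. As a language-neutral sketch of what the underlying API looks like, the snippet below fetches recent bookmarks in plain Python; the endpoint, the HTTP Basic authentication and the post attribute names follow the public del.icio.us v1 API documentation, and the credentials are placeholders.

    # Sketch: fetch recent bookmarks from the del.icio.us v1 API
    # (the tutorial itself uses delicious-java with Processing).
    import urllib.request
    import xml.etree.ElementTree as ET

    USER, PASSWORD = "username", "password"  # placeholder credentials
    API = "https://api.del.icio.us/v1/posts/recent"

    # Every del.icio.us API call is authenticated with HTTP Basic auth
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, API, USER, PASSWORD)
    opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(mgr))

    xml_data = opener.open(API).read()

    # The response is XML: one <post> element per bookmark,
    # with href, description and tag attributes
    for post in ET.fromstring(xml_data).iter("post"):
        print(post.get("href"), "-", post.get("tag"))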

 

This site went live while Generator.x was having a holiday, but it deserves a repost even though it’s a few weeks old:

The ever-productive gentlemen Tom Carden and Karsten Schmidt (Toxi) have launched Processinghacks, a user-contributed Wiki intended to provide the Processing community with documentation of advanced techniques.

Processinghacks nicely fills the gap left by the lack of tutorials on the Processing site, combined with the beginner focus of the built-in examples. While a lot of answers are available on the forums, they are sometimes out of date or hard to find. Processinghacks provides details on specialized techniques that are beyond the scope of the core Processing project, such as integrating Processing with Java or hacking the source code itself.

A big plus is that this effort is completely independent of Ben and Casey, which means that they can focus their energies on the core project of bringing Processing to version 1.0. For those who remember the debate brought up by Karsten a little while ago, this should set an example. Instead of just complaining about the state of things, people like Tom and Karsten are actively providing a service to the community.


 

Toxi aka Karsten Schmidt has been playing productive troublemaker the last few days, blogging some loose thoughts about what kind of tools and ideas are needed for a productive evolution of the computational design field. To roughly summarize: he is critical of the current state of the generative / computational scene and the tools that are being hyped. Among his criticisms is that the work currently popular in the scene is often focused on immediate gratification, duplicating already existing work. It is also often found lacking in niceties like software design, or even a more general understanding of good coding practices.

Karsten used Processing as the basis of his statements, pointing out that the procedural syntax of Processing could encourage lazy coding habits and ultimately prove a dead-end for serious users of the tool. Not surprisingly, this has caused an explosive (but not incendiary) discussion over on the Processing forums. Ultimately, the discussion deals with the theoretical foundation for a tool like Processing, but also with possible future directions for the project. It’s on the techy side, but relevant for anyone who fancies her/himself a coder or who wants to understand what makes a programming language/tool capable of maximum freedom of expression.

Be sure to also read Karsten's followup, where he clarifies his position after some misunderstandings.