PageMove
Exclusive App for PAGE Magazine

Client:
PAGE / Ebner Verlag
For the 01/2012 issue of the design publication PAGE, we were invited to design the magazine's cover, dedicated to "motion design." Rather than visualizing the typical imagery most people associate with motion design, we approached the subject from a different angle. Such a task raises the question of how a movie or animation can best be illustrated in a static image. We therefore reinterpreted the idea: instead of "motion design," we called it "design through motion." This initial idea ultimately resulted in a cover design generated through movement itself.
Using an iOS application created especially for the project with openFrameworks, we analyzed and interpreted the data from the iPad's motion sensors and created a way to generate the cover directly from the user's movement in space.
When the user starts the iPad app, it initially shows only a blank page with the headline and the PAGE logo. As soon as one begins to move the device, the movement immediately starts to leave traces on the white screen. Similar to using a brush on a three-dimensional canvas, one draws forms and colors on the cover with the iPad itself. The motion sensors measure the device's position and acceleration, transforming even the slightest movement into real-time graphics. At any time, one can stop the app and view the current cover image.
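The idea of turning sensor readings into a drawn stroke can be sketched roughly as follows: each accelerometer sample is smoothed to tame noise, then accumulated into a growing 3D trace. This is an illustrative simplification in plain C++, not the project's actual openFrameworks code; the `Vec3` type, the smoothing factor, and the accumulation scheme are assumptions.

```cpp
#include <cmath>
#include <vector>

// Hypothetical 3-axis sample, e.g. from the iPad's accelerometer.
struct Vec3 { double x, y, z; };

// Exponential smoothing to tame sensor noise before drawing.
Vec3 smooth(const Vec3& prev, const Vec3& raw, double alpha) {
    return { prev.x + alpha * (raw.x - prev.x),
             prev.y + alpha * (raw.y - prev.y),
             prev.z + alpha * (raw.z - prev.z) };
}

// Accumulate smoothed samples into a 3D trace: each new point is the
// previous point displaced by the smoothed sample, so the motion
// literally draws the stroke.
std::vector<Vec3> buildTrace(const std::vector<Vec3>& samples, double alpha) {
    std::vector<Vec3> trace;
    Vec3 filtered{0, 0, 0};
    Vec3 pos{0, 0, 0};
    for (const Vec3& s : samples) {
        filtered = smooth(filtered, s, alpha);
        pos = { pos.x + filtered.x, pos.y + filtered.y, pos.z + filtered.z };
        trace.push_back(pos);
    }
    return trace;
}
```

Rendering the resulting trace as a ribbon or tube, with color and width mapped to acceleration magnitude, would produce the kind of motion sculpture shown on the covers.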

The motion is frozen into a kind of motion sculpture, an image of the motion sequence. The final design can be exported directly from the application; the image is thus created entirely on the iPad, directly on the virtual cover.


Available on the iPhone / iPad App Store
Covers
Using an iOS application developed especially for the creation of the cover, we designed an edition of three magazine covers: we analyzed and interpreted the data from the iPad's motion sensors and turned it into abstract visuals of the user's movement in space.

For a content overview visit www.page-online.de
Making of



The first prototype was developed in Processing. Movement data from a connected iPhone was streamed to the application over WLAN using the OSC protocol, which allowed the prototype to be controlled by movement even at this early stage.
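OSC messages are compact binary packets: a null-padded address string, a type-tag string such as ",fff", and big-endian arguments. A minimal reader for the kind of packet the prototype might have received could look like the sketch below. The "/accel" address and the helper names are assumptions; in practice one would use an OSC library such as oscP5 (Processing) or ofxOsc (openFrameworks) rather than parsing by hand.

```cpp
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// OSC pads strings to a multiple of 4 bytes, including the terminating NUL.
static size_t paddedLength(size_t n) {
    return (n + 4) & ~size_t(3);
}

static void appendPaddedString(std::vector<uint8_t>& buf, const std::string& s) {
    for (char c : s) buf.push_back(uint8_t(c));
    for (size_t i = s.size(); i < paddedLength(s.size() + 1); ++i) buf.push_back(0);
}

static void appendFloat(std::vector<uint8_t>& buf, float f) {
    uint32_t bits;
    std::memcpy(&bits, &f, 4);                  // emit big-endian, as OSC requires
    buf.push_back(uint8_t(bits >> 24)); buf.push_back(uint8_t(bits >> 16));
    buf.push_back(uint8_t(bits >> 8));  buf.push_back(uint8_t(bits));
}

// Hypothetical packet carrying one accelerometer reading.
std::vector<uint8_t> makeAccelPacket(float x, float y, float z) {
    std::vector<uint8_t> pkt;
    appendPaddedString(pkt, "/accel");          // assumed address pattern
    appendPaddedString(pkt, ",fff");            // type tags: three float32s
    appendFloat(pkt, x); appendFloat(pkt, y); appendFloat(pkt, z);
    return pkt;
}

struct OscMessage {
    std::string address;
    std::vector<float> floats;
};

// Parse the address, type tags, and float arguments back out of the buffer.
OscMessage parseOsc(const std::vector<uint8_t>& buf) {
    OscMessage msg;
    size_t pos = 0;
    msg.address = reinterpret_cast<const char*>(&buf[pos]);
    pos += paddedLength(msg.address.size() + 1);
    std::string tags = reinterpret_cast<const char*>(&buf[pos]);
    pos += paddedLength(tags.size() + 1);
    for (size_t i = 1; i < tags.size(); ++i) {  // skip the leading ','
        if (tags[i] == 'f') {
            uint32_t bits = (uint32_t(buf[pos]) << 24) | (uint32_t(buf[pos + 1]) << 16)
                          | (uint32_t(buf[pos + 2]) << 8) | uint32_t(buf[pos + 3]);
            float f;
            std::memcpy(&f, &bits, 4);
            msg.floats.push_back(f);
            pos += 4;
        }
    }
    return msg;
}
```

Sending such a packet per sensor frame over UDP is all a WLAN link like the one described needs to drive a desktop prototype in real time.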



We then ported the final application to the iPad and iPhone. To fine-tune the motion sensors, we initially reduced the graphics to the essentials and additionally output the accelerometer values directly on the display.