鉴影 / Reflection

GitHub

  • Team: Yongkun Li (李泳锟) & Fanyi Pan (潘梵怡)
  • Work Platform: Rhino, Laser Cutting, Arduino, C++
  • Core Technology: Bluetooth sensor network, Kinect position tracking, position mapping and projection, pixel-level visualization

Introduction


Many of us live at such a fast pace, often on auto-pilot, that we rarely stop to look around at the things in our lives worth cherishing, even though every one of these everyday interactions carries a meaning of its own.

This project aims to promote that kind of mindful awareness by letting users interact with ordinary objects and furniture in a typical living-room setting. Every interaction, from opening the door to sitting down on a chair, triggers a response in a miniature model of the room placed against one wall. The model is framed by a cabinet of low-transmittance, see-through mirrored acrylic panels, creating an "infinity" effect that presents the room as a generic, unremarkable space.

The interactions are as follows: turning on the light in the real room turns on the light in the model, sitting on a chair makes the miniature chair rock, and picking up the phone makes red lights flash across the model's floor.
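
As a rough sketch of how one of these couplings could be wired (the sensor choice, pin numbers, threshold, and one-byte protocol below are assumptions for illustration, not the project's actual firmware), a sensor node on the real chair might read a force-sensitive resistor and, when someone sits down, push a single event byte over a serial Bluetooth module such as an HC-05 to the Arduino inside the miniature:

```cpp
// Hypothetical sensor node: detects someone sitting on the real chair and
// reports the event over a Bluetooth serial link (e.g. an HC-05 module).
#include <SoftwareSerial.h>

const int FSR_PIN = A0;          // force-sensitive resistor under the seat
const int SIT_THRESHOLD = 600;   // analog reading that counts as "occupied"
SoftwareSerial bt(10, 11);       // RX, TX wired to the Bluetooth module

bool wasSitting = false;

void setup() {
  bt.begin(9600);
}

void loop() {
  bool sitting = analogRead(FSR_PIN) > SIT_THRESHOLD;
  if (sitting && !wasSitting) {
    bt.write('C');               // 'C' = chair occupied; model rocks its chair
  }
  wasSitting = sitting;
  delay(50);                     // simple polling/debounce interval
}
```

A matching sketch on the model side would read the incoming byte and, for instance, sweep a small servo to rock the miniature chair, or flash the red floor LEDs when the phone event arrives.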

On the technical side, the piece combines Kinect position tracking, communication across multiple Bluetooth devices, an Arduino-based sensor network, and position mapping and projection built with openFrameworks.
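
A minimal sketch of the mapping step, assuming the Kinect tracker has already produced a position in room coordinates (the room dimensions, the projected rectangle, and the mouse standing in for tracked data are placeholder assumptions, not the project's actual openFrameworks code):

```cpp
// Minimal openFrameworks sketch: remap a position tracked in the real room
// (e.g. from the Kinect) onto the projected footprint of the miniature model.
#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    float roomX = 0.0f;   // metres across the room
    float roomZ = 0.0f;   // metres into the room

    void setup() override {
        ofBackground(0);
    }

    void update() override {
        // In the installation this position would come from the Kinect tracker;
        // here the mouse stands in for a person moving around the room.
        roomX = ofMap(mouseX, 0, ofGetWidth(), 0.0f, 4.0f);   // room assumed ~4 m wide
        roomZ = ofMap(mouseY, 0, ofGetHeight(), 0.0f, 3.0f);  // room assumed ~3 m deep
    }

    void draw() override {
        // Map room coordinates onto the 400 x 300 px region that the projector
        // aligns with the miniature model.
        float mx = ofMap(roomX, 0.0f, 4.0f, 0.0f, 400.0f);
        float my = ofMap(roomZ, 0.0f, 3.0f, 0.0f, 300.0f);
        ofSetColor(255, 0, 0);
        ofDrawCircle(mx, my, 6);  // marker projected onto the model's floor
    }
};

int main() {
    ofSetupOpenGL(1024, 768, OF_WINDOW);
    ofRunApp(new ofApp());
}
```

In the installation itself, the remapped point would drive the pixel-level visuals projected onto the miniature rather than a single marker.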

Images