Transcribe is a speculative project that recreates plant movements in different weather conditions, spotlighting the often-overlooked phenomena of wind and sunlight. By merging nature and technology, it highlights subtle yet impactful environmental behaviors.
Its core focus, Translating, decodes plant “language” by mapping leaf angles and positions over a 24-hour period based on weather inputs. The approach draws on the 1800s book Elements of Botany, which observed how leaves reorient to optimize sunlight exposure, and the project also explores plant responses to rain and wind.
Users interact through a p5.js environment, requesting city weather via voice. Weather data, processed through APIs, drives an Arduino-controlled robot to simulate plant-like movements. Real-time visualizations reflect these adaptive responses to environmental changes.
The sketches here explore how to create a robot that mimics the actions of plants while also emphasizing its own mechanical nature. The robot replicates the movements of a plant using a series of servo motors that act as joints, analogous to the branching of a plant, with each joint positioned precisely by its motor. The leaf is replicated through the opening and closing of wings, which are also driven by servo motors.
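To make the control scheme concrete, here is a minimal sketch assuming a four-servo layout (a base swivel, two branching joints, and a wing opener). The joint names, angle ranges, and comma-separated command format are illustrative assumptions, not the project's actual protocol.

```javascript
// Hypothetical pose for the plant-robot: one angle (in degrees) per servo.
// Joint names and ranges are assumptions for illustration.
const restPose = {
  baseRotation: 90,  // swivel of the whole stem, 0–180
  lowerJoint: 60,    // first "branching" joint
  upperJoint: 120,   // second "branching" joint
  wingOpening: 10    // 0 = leaf wings closed, 90 = fully open
};

// Flatten a pose into a comma-separated command string that an Arduino
// sketch could parse on the other end of the serial link.
function poseToCommand(pose) {
  return [pose.baseRotation, pose.lowerJoint, pose.upperJoint, pose.wingOpening]
    .map(Math.round)
    .join(',') + '\n';
}
```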
Care was taken to replicate the axial movements a plant might make in response to the weather. However, I recognize that the robot fails to capture the fragile nature of plants. Based on user feedback and my own observations, it is a valid question to ask: was this the best approach to learning about plant behavior?
User Interaction: It begins in the p5.js environment, where users click a “speak” button and request weather information for a specific city via voice. The speech is converted to text using a voice recognition API.
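The write-up does not name the specific voice recognition library, so the sketch below assumes the p5.speech addon; the button label and callback names are illustrative.

```javascript
// Minimal voice-capture sketch, assuming p5.speech is loaded alongside p5.js.
let speechRec;
let spokenRequest = '';

function setup() {
  createCanvas(400, 200);
  speechRec = new p5.SpeechRec('en-US', gotSpeech); // callback runs when speech is recognized
  const speakButton = createButton('speak');
  speakButton.mousePressed(() => speechRec.start()); // listen for one utterance
}

function gotSpeech() {
  if (speechRec.resultValue) {
    spokenRequest = speechRec.resultString; // e.g. "weather in Tokyo"
  }
}
```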
Geocoding: The city name is sent to the Open-Cage Geocoding API, which converts it into geographic coordinates.
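A hedged sketch of that step, assuming the OpenCage `geocode/v1/json` endpoint; the API key is a placeholder.

```javascript
// Convert a spoken city name into latitude/longitude via OpenCage.
const OPENCAGE_KEY = 'YOUR_OPENCAGE_KEY'; // placeholder

async function geocodeCity(city) {
  const url = 'https://api.opencagedata.com/geocode/v1/json' +
    '?q=' + encodeURIComponent(city) +
    '&limit=1&key=' + OPENCAGE_KEY;
  const response = await fetch(url);
  const data = await response.json();
  const { lat, lng } = data.results[0].geometry; // coordinates of the best match
  return { lat, lng };
}
```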
Data Retrieval: These coordinates query the OpenWeather API, retrieving details like weather conditions, wind speed, and local time.
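This could look roughly like the following, assuming OpenWeather's current-weather endpoint; the key is a placeholder and the returned fields are trimmed to what the robot needs.

```javascript
// Fetch current conditions for a coordinate pair via OpenWeather.
const OPENWEATHER_KEY = 'YOUR_OPENWEATHER_KEY'; // placeholder

async function fetchWeather(lat, lon) {
  const url = 'https://api.openweathermap.org/data/2.5/weather' +
    `?lat=${lat}&lon=${lon}&units=metric&appid=${OPENWEATHER_KEY}`;
  const response = await fetch(url);
  const data = await response.json();
  return {
    condition: data.weather[0].main,                      // e.g. "Clear", "Rain"
    windSpeed: data.wind.speed,                           // metres per second
    localTime: new Date((data.dt + data.timezone) * 1000) // UTC timestamp shifted by the local offset
  };
}
```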
Data Processing: Algorithms process the weather data to determine robot movements that simulate plant responses, such as swaying in wind or adjusting leaves for sunlight.
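The exact mapping is not documented here, so the sketch below is one plausible version: wind speed scales the sway of the branching joints, and daytime without rain opens the leaf wings. It reuses the hypothetical pose format and weather fields from the earlier sketches.

```javascript
// Map weather values to servo targets (illustrative ranges only).
function weatherToPose(weather) {
  // Stronger wind -> wider sway, capped at 40 degrees of travel.
  const sway = constrain(map(weather.windSpeed, 0, 20, 0, 40), 0, 40);
  // Daylight without rain -> leaf wings open; night or rain -> wings close.
  const hour = weather.localTime.getUTCHours(); // hours in the city's local time
  const isDay = hour >= 6 && hour <= 18;
  const wingOpening = isDay && weather.condition !== 'Rain' ? 80 : 10;
  return {
    baseRotation: 90 + sway * sin(frameCount * 0.05), // slow oscillation around centre
    lowerJoint: 60 + sway / 2,
    upperJoint: 120 - sway / 2,
    wingOpening: wingOpening
  };
}
```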
Arduino Control: The processed data is sent to an Arduino microcontroller via the p5.js serial library, which controls servo motors to mimic plant-like motions.
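The hand-off might look like this, assuming the p5.serialport addon (which needs its companion serial server running) and a placeholder port name; the Arduino side would parse the four comma-separated angles and write them to its servos.

```javascript
// Open the serial link and push pose commands to the Arduino.
let serial;

function setupSerial() {
  serial = new p5.SerialPort();
  serial.open('/dev/tty.usbmodem14101'); // placeholder port name
}

function sendPose(pose) {
  serial.write(poseToCommand(pose)); // comma-separated angles, newline-terminated
}
```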
Visualization: Real-time visualizations display the robot’s adaptive movements, allowing users to observe plant-inspired reactions to changing environmental conditions.
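As one way the visualization could echo the hardware, here is a minimal p5.js draw loop that renders the current pose as a two-segment stem with opening wings; it is a sketch of the idea, not the project's actual graphics, and assumes `currentPose` is updated elsewhere (for example by the mapping function above).

```javascript
// Draw a stick-figure "plant" that mirrors the commanded servo angles.
let currentPose = { baseRotation: 90, lowerJoint: 60, upperJoint: 120, wingOpening: 10 };

function draw() {
  background(240);
  stroke(30, 120, 60);
  strokeWeight(4);
  translate(width / 2, height);                    // root the stem at the bottom centre
  rotate(radians(currentPose.baseRotation - 90));  // base swivel
  line(0, 0, 0, -60);                              // lower segment
  translate(0, -60);
  rotate(radians(currentPose.lowerJoint - 60));    // first branching joint
  line(0, 0, 0, -50);                              // upper segment
  translate(0, -50);
  rotate(radians(currentPose.upperJoint - 120));   // joint where the leaf wings attach
  rotate(radians(-currentPose.wingOpening / 2));   // spread the two wings symmetrically
  line(0, 0, 0, -30);
  rotate(radians(currentPose.wingOpening));
  line(0, 0, 0, -30);
}
```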
I wanted to explore the convergence of nature and technology, offering users a multi-sensory experience of natural systems through a plant-robot hybrid body. By combining physical interactions with dynamic visualizations, the goal was to help users gain a deeper understanding of nature by leveraging available tools and technology.
However, based on feedback, I am now uncertain about the robot’s role in fostering an understanding of plants. It seems to function more as a weather-telling machine than as a tool for learning about plant behavior. While users enjoyed the interaction, it did not significantly enhance their understanding of plants.
The rigidity of the robot took away from the sense of how completely plants are immersed in their surroundings. As I would put it, the robot worked well as a machine but failed at replicating the hypersensitive being we know as a “plant”.