The Urban BeeLab group is a collaboration between an artist, computer engineers, electronics engineers, a beekeeper and a herbalist.
Our group is currently fabricating a new iteration of a supersensory beehive. The monitoring interface consists of an IR camera, IR LED lights, a grid of temperature sensors and contact microphones.
We expect that the sensor data collected through our interface will allow us to decipher the capacities of bee colonies in urban environments. Through the bee/ICT interaction we also intend to monitor the health of the urban ecosystem and the effects of climate change and airborne pollutants.
The results should allow us to model and support adapted environments in which honeybee colonies can thrive again.
Soundbeehive version 1.0 provided data and audio processing, and also integrated low-energy computing for the organic electronics sensing the microclimate in the hives.
OKNO’s team has qualified sound engineers who have already developed sound architectures with customised microphones within the beehives. This delicate task (the bee sounds are very subtle) calls for a careful and professional approach to arrive at a hardware architecture that is non-intrusive for the bee colony. Pre-amplifiers, contact and omni-directional microphones are attached to and spatialised in the beehive so that the subtle bee sounds are captured in a scientifically reliable process without hindering the colony in its daily behaviour. In the most recent setup, 10 spatialised microphones continuously record the sounds in the brood box, the take-off and landing sounds of the bees on the landing platform, and the movements of the bees through the different stacked boxes of the Warré beehive. The sound files are sent via an internal network to a NAS hard disk for storage. A parallel computer setup ensures real-time online streaming of the bee sounds. All files are annotated with human observations. The time-stamped sensor data (temperature and humidity inside/outside being the most important) is automatically linked to the sound files to give a full spectrum of information.
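The linking of time-stamped sensor data to the sound files could be sketched as below. The filename pattern and the shape of the sensor readings are my assumptions for illustration, not the actual OKNO pipeline: each recording is paired with the sensor reading closest to it in time.

```python
from datetime import datetime

def parse_stamp(name):
    """Extract a timestamp from a hypothetical filename
    like 'brood_2014-05-01T12-30-00.wav'."""
    stem = name.rsplit(".", 1)[0]
    stamp = stem.split("_", 1)[1]
    return datetime.strptime(stamp, "%Y-%m-%dT%H-%M-%S")

def annotate(sound_files, readings):
    """Attach the closest-in-time reading to each sound file.
    `readings` is a list of (timestamp, temperature, humidity) tuples."""
    out = []
    for name in sound_files:
        t = parse_stamp(name)
        nearest = min(readings, key=lambda r: abs(r[0] - t))
        out.append({"file": name, "time": t,
                    "temperature": nearest[1], "humidity": nearest[2]})
    return out
```

Nearest-in-time matching keeps the pairing robust even when the sensor log and the recorder are not sampled on the same clock tick.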
Concerning the application of organic electronics, the OKNO team has already installed and tested experimental setups for distributed temperature sensing in the beehive using low-power thermistors.
By displaying the installations and findings of the research at art/science exhibitions, the OKNO team offers a critical commentary to spark the discussion on Urban Ecologies and the disappearance of the honeybees, emphasizing the importance of experimentation and of continually evaluating what is possible in a close collaboration between scientists and artists, and between the interdisciplinary fields of biology, computer science and design. The exhibitions are also addressed to non-experts, to make them understand the possible implications of a world without bees.
Soundbeehive version 2.0 will focus more on the visual side. A novel design should make it possible to create both a bee counter and an overall view of the development of the colony, all from the image of a single camera.
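One rough way a single-camera bee counter could work is frame differencing: pixels that change between consecutive IR frames are grouped into connected blobs, and each blob above a minimum size is counted as a bee passing through the tunnel. The thresholds, frame format (2-D lists of grey values) and the counting heuristic below are assumptions for illustration, not the actual design.

```python
def count_bees(prev_frame, frame, diff_threshold=30, min_blob=3):
    """Count moving blobs between two greyscale frames of equal size."""
    h, w = len(frame), len(frame[0])
    # mark pixels whose brightness changed more than the threshold
    moving = [[abs(frame[y][x] - prev_frame[y][x]) > diff_threshold
               for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    bees = 0
    for y in range(h):
        for x in range(w):
            if moving[y][x] and not seen[y][x]:
                # flood-fill one 4-connected blob of moving pixels
                stack, size = [(y, x)], 0
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    size += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and \
                                moving[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if size >= min_blob:
                    bees += 1
    return bees
```

The minimum blob size filters out single-pixel sensor noise, which matters for a low-cost IR camera in changing light.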
More details below.
- 8 temperature sensors are connected to the Raspberry Pi and tested (with ice cubes); they still have to be fitted into the sidewalls of the beehive
- plexiglass tunnel for bee counting is installed and should work fine
- streaming tests with the IR camera (still waiting for the right lens from Hong Kong) connected to Chrome work fine
- Balt made a low-cost adjustable camera mount/support, works fine
- IR lighting is sufficient, much stronger than in the previous version of the hive
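Reading out the temperature sensors on the Raspberry Pi could look like the sketch below. It assumes DS18B20-style 1-wire sensors whose sysfs `w1_slave` file ends in `t=<millidegrees>`; the actual sensor type used in the hive is an assumption on my part.

```python
def parse_w1_slave(text):
    """Return degrees Celsius from the two-line w1_slave output
    of a 1-wire temperature sensor (last field is 't=<milli°C>')."""
    milli = int(text.strip().rsplit("t=", 1)[1])
    return milli / 1000.0

def average_temperature(readings):
    """Mean of per-sensor readings, e.g. the 8 sidewall sensors."""
    return sum(readings) / len(readings)
```

On the Pi, each sensor's text would be read from `/sys/bus/w1/devices/<id>/w1_slave` and the eight values averaged per time step.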
needs to be done:
- Phidgets + software
- piezos at the entry? more audio in the hive?
- watchdog for Chrome, automatic startup in case of power cuts
- landing platform
- software to read out every temperature sensor + average
- visualisation of the temperature sensors?
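The Chrome watchdog in the list above could be sketched as a small polling loop: if the watched process is gone, relaunch it. The process name and the use of `pgrep` are placeholders; the check and restart actions are injectable so the loop itself can be exercised without touching real processes.

```python
import subprocess
import time

def is_running(name):
    """True if pgrep finds a process with exactly this name."""
    return subprocess.run(["pgrep", "-x", name],
                          capture_output=True).returncode == 0

def watchdog(check, start, rounds, interval=0.0):
    """Run `check` every round; call `start` whenever the process
    is reported down. Returns the number of restarts issued."""
    restarts = 0
    for _ in range(rounds):
        if not check():
            start()
            restarts += 1
        time.sleep(interval)
    return restarts
```

In production this would run from a startup script (covering the power-cut case) with, say, `check=lambda: is_running("chrome")` and a `start` that relaunches the browser pointed at the stream.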
- Raspberry Pi power-supply problems - solved
- Chrome problems
- 8 temp. sensors connected
- audio on top
- piezos in upper box
- accelerometer ready
- IR cam lens 3.6mm
- LED lights in window frame
⇒ temperature sensors visualised in opensensordata + identification (1→8) and calculations
⇒ accelerometer, code (frequencies and vibrations), visualisation in graph
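Extracting frequencies from the accelerometer data could start with a sketch like this: take a window of samples and find the dominant vibration frequency with an FFT. The sample rate and window length are placeholders, not the actual Phidget configuration.

```python
import numpy as np

def dominant_frequency(samples, sample_rate):
    """Return the strongest non-DC frequency (Hz) in a window of
    accelerometer samples, via a real FFT of the mean-removed signal."""
    samples = np.asarray(samples, dtype=float)
    spectrum = np.abs(np.fft.rfft(samples - samples.mean()))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]
```

Removing the mean suppresses the DC bin (gravity plus sensor offset), so the peak reflects actual hive vibration rather than the constant acceleration component; the per-window peaks could then feed the graph visualisation.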
via bee knees/receptors and antennae/receptors
Phidget logging info:
Later we can apply filters to that raw data.